Network Security Group

An Azure Network Security Group (NSG) is used to manage the flow and direction of network traffic. Besides the default inbound and outbound security rules, an NSG can contain zero or more custom security rules that define security within an Azure Virtual Network.

Purpose of copying security rules

There are many scenarios where you need to clone a Network Security Group and its security rules to a new NSG, or copy the security rules to an existing NSG: as part of a migration, for testing, to replicate the same security measures for a different project, for a disaster recovery site, and so on. You can't move a Network Security Group from one region to another; you can only copy it to the new region and delete the original.

Copy security rules using PowerShell

I have created a PowerShell script to copy the security rules from one Network Security Group to another, and it also has some other abilities:

Copies security rules from one Network Security Group to another.
Creates a new Network Security Group and copies the security rules.
Accepts the Network Security Group by name or as an object.
By default the script merges the security rules, and it has an option to overwrite the existing NSG security rules.
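Under the hood, a copy like this can be sketched with the standard Az.Network cmdlets. The snippet below is a minimal illustration of the merge behavior only, not the actual script; the resource group and NSG names are placeholders, and both NSGs are assumed to exist already:

```powershell
# Minimal sketch of copying NSG security rules with Az.Network cmdlets.
# Illustration only - not the actual Copy-AzNSGSecurityRules script.
$source = Get-AzNetworkSecurityGroup -ResourceGroupName 'rg1' -Name 'nsg1'
$target = Get-AzNetworkSecurityGroup -ResourceGroupName 'rg2' -Name 'nsg2'

foreach ($rule in $source.SecurityRules) {
    # Merge behavior: skip rules that already exist in the target by name
    if ($target.SecurityRules.Name -notcontains $rule.Name) {
        $target | Add-AzNetworkSecurityRuleConfig `
            -Name $rule.Name `
            -Access $rule.Access `
            -Protocol $rule.Protocol `
            -Direction $rule.Direction `
            -Priority $rule.Priority `
            -SourceAddressPrefix $rule.SourceAddressPrefix `
            -SourcePortRange $rule.SourcePortRange `
            -DestinationAddressPrefix $rule.DestinationAddressPrefix `
            -DestinationPortRange $rule.DestinationPortRange | Out-Null
    }
}

# Persist the updated rule set to Azure
$target | Set-AzNetworkSecurityGroup
```

The actual script adds on top of this the parameter sets, NSG creation, and the -Overwrite option described below.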
Code

View the code in GitHub

Syntax

This function accepts the following parameter sets:

Copy-AzNSGSecurityRules -SourceResourceGroupName <string> -SourceNSGName <string> -TargetResourceGroupName <string> -TargetNSGName <string> [<CommonParameters>]

Copy-AzNSGSecurityRules -SourceResourceGroupName <string> -SourceNSGName <string> -TargetResourceGroupName <string> -TargetNSGName <string> -TargetLocation <string> [<CommonParameters>]

Copy-AzNSGSecurityRules -SourceNSG <psobject> -TargetResourceGroupName <string> -TargetNSGName <string> -TargetLocation <string> [<CommonParameters>]

Copy-AzNSGSecurityRules -SourceNSG <psobject> -TargetNSG <psobject> [-Overwrite] [<CommonParameters>]

Example 01

To copy security rules from an existing source NSG to an existing target NSG, using NSG names:

PS C:\> . .\Scripts\Copy-AzNSGSecurityRules.ps1
PS C:\> Copy-AzNSGSecurityRules -SourceResourceGroupName 'rg1' -SourceNSGName 'nsg1' -TargetResourceGroupName 'rg2' -TargetNSGName 'nsg2'

Output:
Following 2 security rule(s) is/are copied from source NSG 'rg1\nsg1' to target NSG 'rg2\nsg2'
Deny_Internet, Allow_SqlServer

Example 02

To create a new NSG and then copy security rules from the existing source NSG:

PS C:\> . .\Scripts\Copy-AzNSGSecurityRules.ps1
PS C:\> Copy-AzNSGSecurityRules -SourceResourceGroupName 'rg1' -SourceNSGName 'nsg1' -TargetNSGName 'nsg2' -TargetResourceGroupName 'rg2' -TargetLocation 'southindia'

Output:
New NSG 'nsg2' has been created in resource group 'rg2' in the 'southindia' location.
Following 2 security rule(s) is/are copied from source NSG 'rg1\nsg1' to target NSG 'rg2\nsg2'
Deny_Internet, Allow_SqlServer

If the target NSG already exists:

The NSG 'nsg2' already exists, so omitting the '-TargetLocation' parameter value and skipping the NSG creation.
Following 2 security rule(s) is/are copied from source NSG 'rg1\nsg1' to target NSG 'rg2\nsg2'
Deny_Internet, Allow_SqlServer

Example 03

To copy security rules from an existing source NSG to an existing target NSG (when NSG objects are provided directly):

PS C:\> . .\Scripts\Copy-AzNSGSecurityRules.ps1
PS C:\> $nsg1 = Get-AzNetworkSecurityGroup -ResourceGroupName 'rg1' -Name 'nsg1'
PS C:\> $nsg2 = Get-AzNetworkSecurityGroup -ResourceGroupName 'rg2' -Name 'nsg2'
PS C:\> Copy-AzNSGSecurityRules -SourceNSG $nsg1 -TargetNSG $nsg2

Output:
Following 2 security rule(s) is/are copied from source NSG 'rg1\nsg1' to target NSG 'rg2\nsg2'
Deny_Internet, Allow_SqlServer

Example 04

To create a new NSG and then copy security rules from the existing source NSG (when the source NSG object is provided directly):

PS C:\> . .\Scripts\Copy-AzNSGSecurityRules.ps1
PS C:\> $nsg1 = Get-AzNetworkSecurityGroup -ResourceGroupName 'rg1' -Name 'nsg1'
PS C:\> Copy-AzNSGSecurityRules -SourceNSG $nsg1 -TargetNSGName 'nsg2' -TargetResourceGroupName 'rg2' -TargetLocation 'southindia'

Output:
New NSG 'nsg2' has been created in resource group 'rg2' in the 'southindia' location.
Following 2 security rule(s) is/are copied from source NSG 'rg1\nsg1' to target NSG 'rg2\nsg2'
Deny_Internet, Allow_SqlServer

If the target NSG already exists:

The NSG 'nsg2' already exists, so omitting the '-TargetLocation' parameter value and skipping the NSG creation.
Following 2 security rule(s) is/are copied from source NSG 'rg1\nsg1' to target NSG 'rg2\nsg2'
Deny_Internet, Allow_SqlServer

Code

View the code in GitHub
Recently, I was asked to write a small PowerShell script to fetch all the Azure resources with no tags at all, but I thought of writing a comprehensive script that not only finds all the resources with no tags, but also finds resources with specific tag name(s), tag value(s), tag(s), or with all tags.

Tagging in Azure

I have already covered Tagging Microsoft Azure Resources Using PowerShell (Az) in my previous post, but just to brief: tags in Azure play a pivotal role in managing resources, predominantly in cost governance strategies, and are very useful for automation and maintaining environment hygiene. More than a resource name, tagging is crucial, and it must be consistent and appropriate across the resources in all resource groups and subscriptions. Many organizations enforce tagging effectively and consistently using Azure policies or automation techniques.

Find-AzResource

However, finding resources in Azure is equally crucial, especially finding resources of all types across multiple subscriptions or resource groups. So I have come up with a PowerShell script to find all the Azure tagged/untagged resources, and you can find the script in my GitHub repo…

View the code in GitHub

The script comes with built-in help, and if you run the script without any parameters it will display the help as below…

C:\Users\kiran\PSScripts> . 
.\Find-AzResource.ps1 C:\Users\kiran\PSScripts> Find-AzResource Output: NAME Find-AzResource SYNOPSIS Find-AzResource gets all the Azure tagged/not tagged resources, SYNTAX Find-AzResource [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] [<CommonParameters>] Find-AzResource -ResourceGroupName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -Tag <Hashtable> [<CommonParameters>] Find-AzResource -ResourceGroupName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -TagValue <String[]> [<CommonParameters>] Find-AzResource -ResourceGroupName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -TagName <String[]> [<CommonParameters>] Find-AzResource -ResourceGroupName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -WithNoTag [<CommonParameters>] Find-AzResource -ResourceGroupName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -AllTagged [<CommonParameters>] Find-AzResource -SubscriptionName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -Tag <Hashtable> [<CommonParameters>] Find-AzResource -SubscriptionName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -TagValue <String[]> [<CommonParameters>] Find-AzResource -SubscriptionName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -TagName <String[]> [<CommonParameters>] Find-AzResource -SubscriptionName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -WithNoTag [<CommonParameters>] Find-AzResource -SubscriptionName <String[]> [-ResourceName <String[]>] [-Location <String[]>] [-ResourceType <String[]>] -AllTagged [<CommonParameters>] DESCRIPTION Find-AzResource gets all the Azure resources with... 
> All tags
> No tags
> Specific tag name(s)
> Specific tag value(s)
> Specific tag(s)

... from one or more resource group(s) or subscription(s), and optionally filter the resources by location, name and type as well.

... output truncated ...

-------------------------- EXAMPLE 1 --------------------------
PS > Find-AzResource
Displays full help

-------------------------- EXAMPLE 2 --------------------------
PS > Find-AzResource -SubscriptionName Sub1, Sub2 -AllTagged
Finds all the resources with tags in the given subscriptions. It works with ResourceGroupName as well. Optionally, you can filter the resources by Name, Location and Type.

-------------------------- EXAMPLE 3 --------------------------
PS > Find-AzResource -SubscriptionName Sub1, Sub2 -WithNoTag
Finds all the resources with no tags in the given subscriptions. It works with ResourceGroupName as well. Optionally, you can filter the resources by Name, Location and Type.

-------------------------- EXAMPLE 4 --------------------------
PS > Find-AzResource -ResourceGroupName RG1, RG2 -TagName Status
Finds all the resources with the given tag name in the given resource groups. It works with subscriptions as well. Optionally, you can filter the resources by Name, Location and Type.

-------------------------- EXAMPLE 5 --------------------------
PS > Find-AzResource -ResourceGroupName RG1, RG2 -TagValue HR, Finance
Finds all the resources with the given tag values in the given resource groups. It works with subscriptions as well. Optionally, you can filter the resources by Name, Location and Type.

-------------------------- EXAMPLE 6 --------------------------
PS > Find-AzResource -ResourceGroupName RG1, RG2 -Tag @{Dept='IT'; Status="Expired"}
Finds all the resources with the given tags in the given resource groups. It works with subscriptions as well. Optionally, you can filter the resources by Name, Location and Type.

View the code in GitHub
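The idea behind the -WithNoTag case can be approximated with the standard Az.Resources cmdlets. The snippet below is a minimal, hypothetical sketch of that one scenario, not the actual script; the resource group names are placeholders:

```powershell
# Minimal sketch of the idea behind Find-AzResource -WithNoTag.
# Hypothetical illustration - not the actual script from the repository.
$resourceGroups = @('RG1', 'RG2')   # placeholder names

$untagged = foreach ($rg in $resourceGroups) {
    # A resource with no tags has a null or empty Tags property
    Get-AzResource -ResourceGroupName $rg |
        Where-Object { -not $_.Tags -or $_.Tags.Count -eq 0 }
}

# Show name, type and location of each untagged resource
$untagged | Select-Object Name, ResourceType, Location
```

The actual script layers the subscription/resource group parameter sets and the tag name/value/hashtable filters on top of this basic pattern.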
When we work with an interactive shell, we use aliases for frequently used cmdlets. Aliases are really useful: they reduce keystrokes, they are simple and easy to use, and sometimes they can do magic as well. However, as a best practice, it is not recommended to use aliases in scripts and functions.

We can create aliases for cmdlets and functions, and even for scripts, using the New-Alias cmdlet…

New-Alias -Name hello -Value Hello-World

We can also create aliases for a function at the time of declaration itself, using the [Alias("")] attribute; we need to place it just before the parameter block param. The parameter block is mandatory, at least as an empty block param().

Function Hello-World {
    [Alias('hello', 'hey')]
    param (
        [string] $Name
    )
    Write-Host "Hello, $Name!"
}
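Once a function declared this way is loaded, the aliases can be used interchangeably with the function name. A quick session might look like this (expected output shown as comments):

```powershell
# The function from above, with aliases declared inline.
Function Hello-World {
    [Alias('hello', 'hey')]
    param (
        [string] $Name
    )
    Write-Host "Hello, $Name!"
}

# All three invocations run the same function:
Hello-World -Name 'PowerShell'   # Hello, PowerShell!
hello -Name 'PowerShell'         # Hello, PowerShell!
hey -Name 'PowerShell'           # Hello, PowerShell!

# List the aliases that resolve to the function
Get-Alias -Definition Hello-World
```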
Do you know that we can turn a parameter value into a command of its own? To test connectivity to a server we usually use Test-Connection -ComputerName ServerName, but how about using ServerName -Ping? I know what you are thinking: what if there are hundreds or thousands of servers, do we need to create that many functions? No, just one function, with the help of the $MyInvocation automatic variable.

What is $MyInvocation?

$MyInvocation is an automatic variable that contains information about the current invocation, such as the function or script name, parameters, parameter values, script root, script path, invocation name, etc. $MyInvocation is populated only for scripts, functions, and script blocks.

In action

I will take the same ping example and show it in action. I will write a small function to check the ping status, named MyInvCmd; of course the name doesn't matter, because I will never invoke it by that name. Here's my function…

Function MyInvCmd {
    $ServerName = $MyInvocation.InvocationName
    Test-Connection -TargetName $ServerName -IPv4   # -TargetName is PowerShell 7
}

Now, if I load this function and call it by its own name in the console, it will throw an error like this…

MyInvCmd error

But, as I said earlier, I will never use this function with this name; instead I will create an alias for this function with my server name. To test this out on my local machine first, I will create an alias named localhost…

New-Alias -Name localhost -Value MyInvCmd

That's all. Now load the MyInvCmd function, call it by the alias name localhost in the console, and see the magic.
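The InvocationName behavior itself is easy to demonstrate without any network calls; here is a tiny sketch (the function and alias names are mine, chosen just for the demo):

```powershell
# Minimal demo: InvocationName reflects the name used to invoke the function.
Function Show-InvocationName {
    # Returns whatever name (function name or alias) the caller typed
    $MyInvocation.InvocationName
}

New-Alias -Name foo -Value Show-InvocationName

Show-InvocationName   # prints: Show-InvocationName
foo                   # prints: foo
```

This is exactly the property MyInvCmd exploits: the alias name becomes data inside the function.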
You will see the ping results of the localhost as below…

localhost example

If you take a look at the function, $MyInvocation returns the output below. Although the actual function name is MyInvCmd, since it is invoked by the alias name localhost, the value of InvocationName is localhost; that value is assigned to $ServerName and used with the -TargetName parameter of the Test-Connection cmdlet.

MyCommand             : MyInvCmd
BoundParameters       : {}
UnboundArguments      : {}
ScriptLineNumber      : 1
OffsetInLine          : 1
HistoryId             : 4
ScriptName            :
Line                  : localhost
PositionMessage       : At line:1 char:1
                        + localhost
                        + ~~~~~~~~~
PSScriptRoot          :
PSCommandPath         :
InvocationName        : localhost
PipelineLength        : 1
PipelinePosition      : 1
ExpectingInput        : False
CommandOrigin         : Runspace
DisplayScriptPosition :

So, whatever alias you create for MyInvCmd will be used inside the function as a server name. So what I will do is get the list of servers and create aliases for all of them…

$FilePath = "C:\Inventory\Servers.txt"
Get-Content -Path $FilePath | ForEach-Object {
    New-Alias -Name $_ -Value MyInvCmd
}

Now all the server names are aliases of MyInvCmd, but every alias invocation is unique, and the function works against the aliased server name. This time I will add more functionality and parameters as well, so that we can change the behavior on every execution.
Besides the ping functionality, I will add the ability to get the server info, enter a PSSession, open an RDP session, and you can still continue to add more…

function MyInvCmd {
    <#
    .SYNOPSIS
    Replace MyInvCmd with the aliased server name
    #>
    [CmdletBinding(DefaultParameterSetName = 'Help')]
    param (
        [Parameter(Mandatory = $false)]
        [switch] $Ping,

        [Parameter(Mandatory = $false)]
        [switch] $ServerInfo,

        [Parameter(Mandatory = $false, ParameterSetName = 'PSSession')]
        [switch] $PSSession,

        [Parameter(Mandatory = $false, ParameterSetName = 'RDP')]
        [switch] $RDP
    )

    $ServerName = $MyInvocation.InvocationName

    if ($Ping) { Test-Connection -TargetName $ServerName -IPv4 }   # -TargetName is PowerShell 7

    if ($ServerInfo) {
        $OSInfo = Get-CimInstance -ClassName CIM_OperatingSystem -ComputerName $ServerName
        $CSInfo = Get-CimInstance -ClassName CIM_ComputerSystem -ComputerName $ServerName
        [PSCustomObject]@{
            OperatingSystem = $OSInfo.Caption
            TotalMemory     = $CSInfo.TotalPhysicalMemory
            FreeMemory      = $OSInfo.FreePhysicalMemory
            NoOfCPUs        = $CSInfo.NumberOfProcessors
            NoOfLogicalCPUs = $CSInfo.NumberOfLogicalProcessors
            LastRebootTime  = $OSInfo.LastBootUpTime
        }
    }

    if ($PSSession) { Enter-PSSession -ComputerName $ServerName }

    if ($RDP) { Start-Process -FilePath mstsc.exe -ArgumentList "/v:$ServerName" }

    if ($MyInvocation.BoundParameters.Count -eq 0) { Get-Help -Name $ServerName }
}

And now, I will get my list of servers and add them as aliases to this function. Since I am running this demo in my lab, which runs on Hyper-V, I will get all the VM names and add them as aliases…

Get-VM | ForEach-Object { New-Alias -Name $_.Name -Value MyInvCmd }

And here's the list of aliases I have created for MyInvCmd; if you notice, all the server names point to MyInvCmd itself.

aliases list

Now, let's see how to use these aliases: you can use the server name as a command and pass parameters to it.
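With those aliases in place, a session against a server named, say, SERVER01 (a hypothetical name from the alias list) might look like this:

```powershell
# Hypothetical usage, assuming 'SERVER01' has been aliased to MyInvCmd:
SERVER01                  # no parameters - displays the function's help
SERVER01 -Ping            # runs Test-Connection against SERVER01
SERVER01 -ServerInfo      # returns an OS/memory/CPU summary via CIM
SERVER01 -PSSession       # enters a remote PowerShell session
SERVER01 -RDP             # launches an RDP session (mstsc /v:SERVER01)
```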
aliases in action

What I like the most about this concept is that I can use it for pretty much anything: not only servers, but also request numbers, incidents, and so on, and you can add a lot more functionality at your convenience.
Hosting on ☁️ Azure Static WebApps (Preview)

Azure Static WebApps
https://docs.microsoft.com/en-in/azure/static-web-apps/overview

Microsoft announced Azure Static WebApps in preview at the Build 2020 virtual event. Azure Static WebApps is an Azure service that hosts a web app directly from a source code repository, with Azure-managed CI/CD pipelines. As of now it supports only GitHub; it uses GitHub Actions workflows for CI/CD and Azure Functions for backend support. The moment you commit code to the repository, it builds the site and deploys it to the associated Azure Static WebApp. At the time of writing this post, Azure Static WebApps is still in public preview with a free plan offering.

Image from Microsoft Docs

Key Features

Web hosting - hosts static content like HTML, CSS, JavaScript, and images.
Azure Functions for backend support.
Azure-managed CI/CD - every commit to the code triggers a build and deploys the site.
CDN support - globally distributed static content, putting content closer to your users.
Free SSL certificates, which are automatically renewed.
Supports custom domains for your website.
Pull requests enable preview versions of your site before publishing.

Prerequisites

Ready-to-deploy website
Git installed
GitHub account
Azure account with a valid subscription

This post is a continuation of my previous post Create and Build A Static Website With Hugo; if you don't have a ready-to-deploy website and want to build a new site using Hugo, you can refer to that post to create a new static website within a few minutes. If you have not installed Git already, you can go to the Git website and install it.

Push the site to the GitHub repository

Once you are done with site customization, have added the content, and built the site, you can push your site to a GitHub repository.
Go to the site's root directory in your favorite command line terminal and then run the following commands; in my case C:\Sites\kpatnayakuni is my site's root directory.

To initialize the local repository…

C:\Sites\kpatnayakuni> git init

To add the changes…

C:\Sites\kpatnayakuni> git add .

To commit the changes…

C:\Sites\kpatnayakuni> git commit -m "Initial Commit"

If you already have a GitHub account, log in and create a repository with the same name as the local repository; in my case it is kpatnayakuni. If you don't have a GitHub account, go to https://github.com/, sign up with your email id, and create the repository.

If you are using Git for the first time, you need to set some global settings to work with remote repositories…

To set the user name…

C:\Sites\kpatnayakuni> git config --global user.name "<User Full Name>"

To set the email address…

C:\Sites\kpatnayakuni> git config --global user.email "<User Email Address>"

Use your own name and email in the above examples.

To add the remote repository…

C:\Sites\kpatnayakuni> git remote add origin https://github.com/<user_name>/<repository_name>.git

To push the site to GitHub…

C:\Sites\kpatnayakuni> git push -u origin master

With our site up in the GitHub repository, let's now create a Static WebApp in the Azure Portal. If you don't have an Azure account, you can sign up using your GitHub credentials or create a new account and get a subscription. As of now, the Azure Static WebApps service comes with a free plan to start with.
Log in to the Azure Portal, click on the search box, type static, and then click on the Static Web Apps (Preview) service…

In the Azure Portal

When the Static Web Apps (Preview) blade opens, click on the Add button to create a new static web app…

In the Azure Portal

As shown in the image below, select the subscription and the resource group where you want to host your static website, enter the app name, select the region nearest to your location, and then click on Sign in with GitHub to select the GitHub repository our static site will come from.

In the Azure Portal

Once the GitHub login is successful, select the user name/organization name as the organization, along with the repository and branch, then click Next…

In the Azure Portal

If you have only the built site in the GitHub repository, you can set the root / as your App Location; if you have the complete site along with the build folder, set the build folder as the App Location, which in my case is /public. You can leave the Api and Artifacts locations for now; if you want to add Azure Functions and configure artifacts, you can refer to the Microsoft Docs.

In the Azure Portal

Click on Review + create, and if the review is successful, click on Create to deploy the Static Web App. Once the deployment is successful, click on Go to resource to go to the Static Web App blade.

In the Azure Portal

Now your site is up, and you will land on the Static Web App (Preview) overview page with all the necessary info like the URL, workflow file, etc., and you can click on the URL link to browse your site.

In the Azure Portal

Azure Static WebApps creates a random URL for your site, but you can add a custom domain to your web app. Azure Static WebApps also lets you create a staging environment by creating a pull request; when the pull request is merged, the staging environment is promoted to production.
In the Azure Portal

Recently, I migrated my WordPress site to a static website using Hugo, and this site is currently hosted on Azure Static WebApps. So far I have not experienced any issues with this preview hosting platform; however, I have a backup plan ready to switch to GitHub Pages in case of any hurdles.
Introduction

Not long ago, creating a personal website/blog was extremely painful: layout design, site development, managing security, SSL certificates, owning a domain, the hosting platform, and of course the content were all on you, which was almost impossible if you were not from a web development background. Later, a few website providers evolved and began offering web hosting services, but most of the control still stayed with the service provider, and any greater customization involved additional cost.

Now, in the modern era of cloud technology, you don't need to be a web developer any more, and in fact not necessarily a techie at all, because everything except the content is handled for you: layout design and site development are taken care of by website generators, while security, SSL certificates, domains and hosting are handled by cloud platforms. All you need to concentrate on is your content. And if you are from a technical background, not necessarily web development, you can build your own website/blog in a matter of minutes; with a little exposure to development, you can take the customization to a different level and generate a feel-good website of your choice.

What is a static website?

A static website is a collection of HTML web pages with fixed content; every time a page loads, it displays the same information to every visitor. The content remains unchanged after the site is generated/built and doesn't require any coding or backend (database) support, so the pages load quickly and reliably; however, any change in the content requires regenerating the whole site. Simply put, static websites are the most basic type of website and are easy to create.
Static websites are used for…

Company/personal branding
Personal website or blog
Project documentation or product manuals
Advertisement or promotional offers for a product
Presentation or status pages
Newsletters & survey forms
Landing pages

Now let's see how to build a static website/blog using the Hugo static site generator, and the available hosting options.

Build a static website with Hugo

Hugo
https://gohugo.io/

Hugo is one of the most popular open-source static website generators. It is a standalone tool, very easy to use, and builds websites extremely fast. It's a cross-platform tool and flexible enough to host the website on pretty much any hosting platform. There is a huge collection of community themes to make your site beautiful.

Advantages

No coding or HTML knowledge required
Creates web pages using simple markdown syntax
Renders the pages extremely fast
Standard framework and ease of customization
Inbuilt Google Analytics and comment system
It's free, with huge community support

Install Hugo

I am using the Windows operating system to install Hugo; if you are using another operating system, please refer to this link to follow the installation instructions. Before you install Hugo, ensure you have Git installed; it is recommended by Hugo.

Hugo is written in the Go language with cross-platform support, but you don't need to install Go to run Hugo; it is a pre-compiled standalone binary. You can download the latest release of Hugo from the GitHub repository under releases. If you need Sass/SCSS support, download the extended version of Hugo.

Since Hugo is a pre-compiled standalone binary, it doesn't require any installation; you can directly grab the binary and add its path to the PATH environment variable.

To set the PATH environment variable…

setx path "%path%;c:\directoryPath"

c:\directoryPath is the place where the Hugo binary file is extracted.
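If you prefer doing this from PowerShell instead of setx, the same persistent PATH update can be sketched as below. This is just an alternative approach; c:\directoryPath is the assumed extraction location, and like setx, the persistent change only affects newly opened sessions:

```powershell
# Append the Hugo binary folder to the *user* PATH persistently.
# 'c:\directoryPath' is an assumed location; adjust to where hugo.exe lives.
$hugoDir  = 'c:\directoryPath'
$userPath = [Environment]::GetEnvironmentVariable('Path', 'User')
[Environment]::SetEnvironmentVariable('Path', "$userPath;$hugoDir", 'User')

# Also update the current session so 'hugo' works immediately
$env:Path += ";$hugoDir"
```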
Install using the Chocolatey package provider

If you don't have the Chocolatey package manager installed already, follow the instructions here to install it on the Windows operating system.

To install Hugo…

choco install hugo -confirm

Or if you need the extended version…

choco install hugo-extended -confirm

If you install Hugo using the Chocolatey package manager, it will take care of setting the PATH environment variable.

Verify installation

To verify the installation…

hugo version

Create a new site

Go to a suitable directory to create a new Hugo site and run the command below in your favorite command line terminal. Here I am using the C:\Sites folder to create a new site, running the command in a PowerShell terminal.

hugo new site kpatnayakuni

kpatnayakuni is my new site name.

Output:

Congratulations! Your new Hugo site is created in C:\Sites\kpatnayakuni.

Just a few more steps and you're ready to go:

1. Download a theme into the same-named folder.
   Choose a theme from https://themes.gohugo.io/ or
   create your own with the "hugo new theme <THEMENAME>" command.
2. Perhaps you want to add some content. You can add single files
   with "hugo new <SECTIONNAME>\<FILENAME>.<FORMAT>".
3. Start the built-in live server via "hugo server".

Visit https://gohugo.io/ for quickstart guide and full documentation.

Now I have created my new site named kpatnayakuni, and Hugo has created the site files for us…

Directory: C:\Sites\kpatnayakuni

Mode    LastWriteTime        Length  Name
----    -------------        ------  ----
d----   17-06-2020 05:47 AM          archetypes
d----   17-06-2020 05:47 AM          content
d----   17-06-2020 05:47 AM          data
d----   17-06-2020 05:47 AM          layouts
d----   17-06-2020 05:47 AM          static
d----   17-06-2020 05:47 AM          themes
-a---   17-06-2020 05:47 AM      82  config.toml

Download a theme

Now that we have created our new site, let's add a theme to it. You can find a whole bunch of themes on the Hugo themes page, where you can go to a theme's download page and see the download instructions.
For now I will pick the hugo-refresh theme and add it to our new site.

First, go to the site folder and then initialize the git repository…

PS C:\Sites> cd kpatnayakuni
PS C:\Sites\kpatnayakuni> git init

Add the theme to the themes folder in the site…

PS C:\Sites\kpatnayakuni> git submodule add https://github.com/PippoRJ/hugo-refresh.git themes/hugo-refresh

Alternatively, you can directly download the theme and extract it into the site's themes folder.

Almost all themes include an example site with a configuration file. You can copy that file into your site and configure it as needed. First remove the default site configuration file, and then copy the example site configuration file from the exampleSite folder inside the theme folder to the actual site…

PS C:\Sites\kpatnayakuni> rm config.*
PS C:\Sites\kpatnayakuni> copy ./themes/hugo-refresh/exampleSite/config.* ./

The configuration file is self-explanatory, and you can modify site settings like Title, Description, Author and many more as per your requirements. You can also write your own configuration file from scratch, or configure it even further; please refer here for more details. Hugo understands configuration files in the toml, yaml or json format; by default the configuration file should be in the site's root directory with the filename config, and the extension can be any of .toml, .yaml or .json.

Create a page

Once you are done adding the theme and the basic site configuration, we will add some content to the site. By default, content pages are added in the content folder under the site's root directory. You should know basic markdown syntax to create the site's content, and Hugo has the ability to convert the markdown files into HTML files.
If you are new to markdown syntax, you can refer to https://www.markdownguide.org/ to familiarize yourself; it is very simple and easy to use. You can create or edit the markdown files in any text editor of your choice; I am using Visual Studio Code.

To create a new page, run the command below in the site's root directory…

PS C:\Sites\kpatnayakuni> hugo new posts\welcome.md

This creates the file welcome.md under <site's root directory>\content\posts\welcome.md; in my case it is C:\Sites\kpatnayakuni\content\posts\welcome.md.

By default the page is created with some front matter in it…

---
title: "Welcome"
date: 2020-06-18T05:09:42+05:30
draft: true
---

The front matter is nothing but the attributes of the page, like title, date, author, etc. There are many other attributes that can change the behavior of the page; some are defaults from Hugo itself, and some are specific to the theme you have selected. Please refer to the theme's README file in the repository for more details, and also to the exampleSite under the theme's folder in the site's root directory for a better understanding of organizing the content.

Now, let's add some content to the welcome.md file. Open the file with your favorite text editor and add some content after the front matter…

---
title: "Welcome"
date: 2020-06-18T05:09:42+05:30
draft: true
---

# This is a sample post

This is my personal website and blog.

Test the site on localhost

Hugo comes with an inbuilt webserver with a live rebuild feature, which is really helpful during the development phase; you can instantly see the changes rendered in the website.

To start the webserver…

hugo server -D

The -D flag is used to include the draft pages as well; if you don't want to include the draft pages, remove the -D flag. You can manage draft pages by setting the draft parameter to true/false in the front matter of the content page.
Output:

Building sites …
                   | EN | RU
-------------------+----+-----
  Pages            |  5 |  3
  Paginator pages  |  0 |  0
  Non-page files   |  0 |  0
  Static files     | 28 | 28
  Processed images |  1 |  0
  Aliases          |  1 |  0
  Sitemaps         |  0 |  0
  Cleaned          |  0 |  0

Built in 241 ms
Watching for changes in C:\Users\kiran\AppData\Local\Temp\kpatnayakuni\{archetypes,content,data,layouts,static,themes}
Watching for config changes in C:\Users\kiran\AppData\Local\Temp\kpatnayakuni\config.yaml
Environment: "development"
Serving pages from memory
Running in Fast Render Mode. For full rebuilds on change: hugo server --disableFastRender
Web Server is available at http://localhost:1313/ (bind address 127.0.0.1)
Press Ctrl+C to stop

By default, Hugo brings up the webserver on localhost with port number 1313. If you see the webserver's local address in the output, your webserver is up and running on your localhost; you can then open your favorite browser and browse to the address mentioned in the output.

Here's my site with the hugo-refresh theme…

Home Page
Posts List
Post

Note: I haven't done any site customization yet; it is the default configuration from the theme. We will see more customization in the next section below.

Site configuration

All the site-related configuration lives in the site's root directory in the file config.toml or config.yaml. You can open the site configuration file with your favorite text editor and configure it as needed. Now we will see a few predominant settings that will affect your site…

My site configuration is in config.yaml, so I am using yaml syntax; if you are using another format, please look up the corresponding syntax.

baseURL: "https://kpatnayakuni.com"
title: "Kiran Patnayakuni"
theme: "hugo-refresh"

Hugo has an inbuilt Google Analytics and comment system; you only need to create the required accounts and add those references in the respective configuration sections.
For more customization please refer to the configuration page on the Hugo website. Almost all the configuration is self-explanatory, and you can enter the values accordingly.

Theme customization

By default, Hugo themes already look great, but you still have many options to customize them to your liking. First of all, choose a theme that suits your purpose, so that the customization will be easy and mostly readily available for you. Before you do any theme customization, please visit the example site and look at the syntax in the respective markdown files, so that you have a reference for the various components used in the site and can use them in your site as well. To bring up the example site, go to the exampleSite folder (<site-root-directory>\themes\<theme-name>\exampleSite) and run the command below…

hugo server --themesDir ../..

Please refer to the theme components page on the Hugo website for a better understanding of the site's folder structure. You should have a little knowledge of HTML & CSS to customize the site's layout, such as how the home page, list page, or post page looks, and to change the behavior of the site you should be familiar with JavaScript as well, but neither is mandatory to build your site. Pick a theme which you really like, but which allows customization to any extent. To create rich content in your site you can make use of shortcodes; a few come with the theme and a few are built into Hugo.

What is a shortcode? Shortcodes are written in HTML to achieve the richness where markdown falls short. Pick a theme which has a good number of shortcodes, and also refer to the shortcodes page on the Hugo website for more built-in shortcodes.

Build the site

To build your ready-to-deploy site run the command below…

hugo

Use the -D flag to include the draft pages as well.
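As a quick illustration of the shortcodes mentioned earlier, Hugo ships with a few built-in ones that you can drop straight into a markdown page; the video ID and image path below are placeholders:

```markdown
<!-- In content/posts/welcome.md, after the front matter -->

<!-- Hugo's built-in youtube shortcode embeds a video by its ID -->
{{< youtube w7Ft2ymGmfc >}}

<!-- The built-in figure shortcode renders an image with a caption -->
{{< figure src="/images/sample.png" title="A sample figure" >}}
```

Theme-provided shortcodes are used the same way; look in the theme's layouts/shortcodes folder and its exampleSite for the names it supports.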
hugo is the base command to build the Hugo site, and by default it builds the site under the public folder in the site's root directory; you can change this in the site configuration file. It is a complete site and is ready to deploy on any of your favorite hosting platforms.

Push the site to GitHub repository

Once you are done with the site customization, added the content, and built the site, you can push your site to a GitHub repository. Go to the site's root directory in your favorite command line terminal and then run the following commands…

To initialize the local repository…
C:\Sites\kpatnayakuni> git init

To add the changes…
C:\Sites\kpatnayakuni> git add .

To commit the changes…
C:\Sites\kpatnayakuni> git commit -m "Initial Commit"

If you already have a GitHub account, then log in and create a repository with the same name as the local repository; in my case it is kpatnayakuni. If you don't have a GitHub account, then go to https://github.com/, sign up with your email id, and create the repository. If you are using git for the first time, you need to set some global settings to work with the remote repositories…

To set the user name…
C:\Sites\kpatnayakuni> git config --global user.name "<User Full Name>"

To set the email address…
C:\Sites\kpatnayakuni> git config --global user.email "<User Email Address>"

You need to use your own name and email in the above examples.

To add the remote repository…
C:\Sites\kpatnayakuni> git remote add origin https://github.com/<user_name>/<repository_name>.git

To push the site onto GitHub…
C:\Sites\kpatnayakuni> git push -u origin master

Publish the site

Now we are all done, and we have a website ready to deploy onto any of our favorite hosting platforms. Hugo builds a site which can be hosted on pretty much any hosting platform, and in fact Hugo has an inbuilt feature to deploy onto popular cloud providers like Azure, AWS and GCP.
You can refer to the Hugo Deploy page on the Hugo website, and to the Hosting & Deployment page for other popular hosting solutions.

Hosting on GitHub

I personally suggest you host your site on GitHub Pages; it is super easy and extremely fast. All it needs is a GitHub account, and it is totally free. You can refer to the Hosting On GitHub page on the Hugo website to deploy your site in a step-by-step manner. I have recently moved my site from GitHub Pages to Azure Static WebApps; however, I always have the option to switch back to GitHub Pages at any point of time in case of any issue with the other hosting platform.

Hosting on Azure Static WebApps (Preview)

I have no reason to suggest you host your static website on any platform other than GitHub Pages. In the interest of the length of this post, and to explore Azure Static WebApps, I will explain a little about this preview service in my next blog post, Publish a static website on Azure Static WebApps (Preview), in continuation of this post.
Azure Key Vault stores secrets safely and lets us access them securely as needed, without hard-coding them in our code to authenticate to various applications in various environments. The main challenge here is authenticating to the Key Vault itself: if that authentication is compromised, then all the secrets in the vault are compromised, so it should be handled properly. To overcome this and handle the secrets securely, Azure has come up with a concept called Managed Identities for Azure Resources, a feature of Azure Active Directory which helps us authenticate between Azure resources with a trusted relationship. There are two types of managed identities: System Assigned Managed Identity (SAMI) and User Assigned Managed Identity (UAMI). I am not going to discuss Managed Identities in depth here; you can refer to the Microsoft Docs.

In this demo, I am going to enable a User Assigned Managed Identity between an Azure Virtual Machine and an Azure Key Vault using PowerShell, to access the Key Vault from the VM and retrieve the secrets with a trusted relationship. You can enable the User Assigned Managed Identity while creating the VM itself, but for our demo I am going to enable it after the VM is created. The steps are as follows…

Create a new Virtual Machine
Create a Managed Identity
Create a Key Vault
Add a secret to the Key Vault
Grant the Reader IAM role on the Key Vault to the Managed Identity
Grant an access policy to the Managed Identity on the Key Vault to get the secrets
Enable the User Assigned Managed Identity on the Virtual Machine
Test the access on the Virtual Machine

Create a new Virtual Machine

For our demo I am going to create a simple Windows VM with all default values using the New-AzVM CmdLet…

# Create a Resource Group
$rgName = 'Test-RG'
$location = 'westus'
$null = New-AzResourceGroup -Name $rgName -Location $location

# Create a Virtual Machine
$vmName = 'Test-VM'
$userName = 'sysadmin'
$plainTextPassword = 'P@ssw0rd!'
$securePassword = $plainTextPassword | ConvertTo-SecureString -AsPlainText -Force
$credential = [pscredential]::new($userName, $securePassword)
$vm = New-AzVM -ResourceGroupName $rgName -Name $vmName `
    -Location $location -Credential $credential

The above code will create a brand new VM with all defaults and allow the RDP (3389) and PowerShell Remoting (5985) ports. If you want to create a fully configured VM of your choice, you can refer to the script in my GitHub repo. Users with the Contributor role can create & manage Virtual Machines.

Create a Managed Identity

Install and import the module Az.ManagedServiceIdentity to create a managed identity using the New-AzUserAssignedIdentity CmdLet; this module is not part of the Az PowerShell module.

# Install and import the module
$moduleName = 'Az.ManagedServiceIdentity'
Install-Module -Name $moduleName -Force
Import-Module -Name $moduleName

# Create a User Assigned Managed Identity
$identityName = 'amuai'
$identity = New-AzUserAssignedIdentity -Name $identityName `
    -ResourceGroupName $rgName -Location $location

The above command will create a User Assigned Managed Identity named amuai.

Create a Key Vault

Create an Azure Key Vault to store the secrets, which we will access from the Virtual Machine using the Managed Identity…

# Create an Azure Key Vault
$keyVaultName = 'testakv99'
$keyVault = New-AzKeyVault -ResourceGroupName $rgName `
    -Name $keyVaultName -Location $location

The above command will create an Azure Key Vault named testakv99, and now we will add a secret to it.

Add a secret to the Key Vault

For this demo, we will add the same password that was used to create our test VM.

# Add a secret to the Key Vault
$null = Set-AzKeyVaultSecret -VaultName $keyVaultName `
    -Name $userName -SecretValue $securePassword

Users with access to the Key Vault have the same access to all the secrets in the vault, so please add only the secrets the user actually requires.
Grant the Reader IAM role on the Key Vault to the Managed Identity

For this demo, the managed identity needs to read the credentials from the Key Vault, so we grant the Reader role on the Key Vault to the Managed Identity.

# Grant the Reader role to the Managed Identity on the Key Vault
$null = New-AzRoleAssignment -ApplicationId $identity.ClientId `
    -RoleDefinitionName Reader -Scope $keyVault.ResourceId

The IAM role is limited to the Key Vault resource only and has no effect on accessing the secrets; you need to grant a Key Vault access policy to get the secret.

Grant an access policy to the Managed Identity on the Key Vault to get the secrets

Since this is a demo and we only need to get the secret to use in our code on the managed-identity-enabled VM, we will grant the Get access policy for secrets on the Key Vault to the managed identity.

# Grant GET permissions on secrets in the Key Vault to the managed identity
Set-AzKeyVaultAccessPolicy -ResourceGroupName $rgName -VaultName $keyVaultName `
    -ServicePrincipalName $identity.ClientId -PermissionsToSecrets get

Now we need to assign this identity to the VM and enable the User Assigned Managed Identity on it.

Enable the User Assigned Managed Identity on the Virtual Machine

Since we have already created a VM and have that VM object in $vm, we will use the same variable for our demo; otherwise you can get an existing VM using the Get-AzVM CmdLet.

# Assign the identity and enable the User Assigned Managed Identity on the VM
$null = Update-AzVM -ResourceGroupName $rgName -VM $vm `
    -IdentityType UserAssigned -IdentityID $identity.Id

All done; now let's test this on the Managed Identity enabled VM.

Test the access on the Virtual Machine

Now let's test the access to the Key Vault from the VM without an explicit authentication and get the secret from the Key Vault. Since I want to use the secret in PowerShell code, I will test the access and retrieve the secret using PowerShell itself, so I will install the Az module on the VM for our testing purposes.
If you are not using this secret in PowerShell, you can use the REST methods to get the secrets from the Key Vault; please refer to the Microsoft Docs for the same. Get the public IP of the VM and connect to it…

# Get the public IP of the new VM
$vmPIP = Get-AzPublicIpAddress -ResourceGroupName $rgName -Name $vmName | % IpAddress

Now let's install the Az module, authenticate to Azure using the -Identity flag, and access the secret…

## On the NEW VM on which the User Assigned Managed Identity is enabled
# Enter-PSSession -ComputerName $vmPIP -Credential $credential

# Install the NuGet package provider so the Az module can be installed
Install-PackageProvider -Name NuGet -Force

# Install the Az module
Install-Module -Name Az -Force

# Log in to Azure with the managed identity
Login-AzAccount -Identity

# Get the secret from the Key Vault
$kvName = 'testakv99'
$keyName = 'sysadmin'
$Secret = Get-AzKeyVaultSecret -VaultName $kvName -Name $keyName | % SecretValueText

# You can use this secret in your code; since this is a demo I am writing it to the screen
Write-Host "The password is: $Secret"

You can take a glance at the complete script in my GitHub repo.
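For completeness, here is a minimal sketch of the REST approach mentioned above, run from inside the VM. The metadata endpoint and API versions are the documented Azure IMDS and Key Vault values; the vault and secret names are the ones from this demo:

```powershell
# 1. Get an access token for Key Vault from the Azure Instance Metadata Service
#    (reachable only from inside an Azure VM that has the identity assigned;
#    with multiple identities, append '&client_id=<clientId>' to pick one)
$tokenResponse = Invoke-RestMethod -Headers @{ Metadata = 'true' } -Uri `
    'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net'

# 2. Call the Key Vault REST API with the bearer token
$secretUri = 'https://testakv99.vault.azure.net/secrets/sysadmin?api-version=7.0'
$secret = Invoke-RestMethod -Uri $secretUri `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }

# The plain-text secret value
$secret.value
```

This avoids installing the Az module on the VM entirely, which can be handy on minimal images.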
In PowerShell, we use Tab to navigate through the available CmdLets, parameters, parameter values, and files or folders in a given path. But to list all of them at once and select the one you need, press Ctrl + Space and then use the arrow keys; it even works with wildcard characters. Try it now…
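Ctrl + Space works because PSReadLine binds it to the MenuComplete function by default in recent versions. If it does not work in your console, you can bind it yourself; a minimal sketch:

```powershell
# Bind Ctrl+Spacebar to PSReadLine's menu completion
Set-PSReadLineKeyHandler -Key Ctrl+Spacebar -Function MenuComplete

# Optionally make Tab itself menu-complete instead of cycling through values
Set-PSReadLineKeyHandler -Key Tab -Function MenuComplete
```

Add these lines to your PowerShell profile to make the bindings persistent across sessions.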
A traditional way of creating a PSCustomObject is as follows…

$Emp = [pscustomobject] @{
    Name        = ''
    Department  = ''
    Designation = ''
}
$Emp

Output:
Name Department Designation
---- ---------- -----------

Check the object type
C:\> $Emp.GetType().Name
Output:
PSCustomObject

And using Select-Object it is quite simple…

$Emp = '' | Select-Object -Property Name, Department, Designation
$Emp

Output:
Name Department Designation
---- ---------- -----------

Check the object type
C:\> $Emp.GetType().Name
Output:
PSCustomObject
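One thing worth noting about the hashtable form: the [pscustomobject]@{...} literal preserves the property order as written, but casting a plain hashtable held in a variable does not guarantee it. If property order matters, an [ordered] hashtable is a safe sketch:

```powershell
# [ordered] guarantees the properties come out in declaration order
$props = [ordered]@{
    Name        = ''
    Department  = ''
    Designation = ''
}
$Emp = [pscustomobject] $props

# Lists the property names in declaration order
$Emp.psobject.Properties.Name
```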
ValidateSet vs ArgumentCompleter

The ValidateSet parameter attribute lets you tab through the specified values of a particular parameter in a CmdLet or a Function; it only accepts the values specified in the set and cannot generate dynamic values by default, though we can create a separate class to generate dynamic values for it. The ArgumentCompleter parameter attribute is similar, but it takes a scriptblock to generate dynamic tab-completion values for a parameter, and it also accepts values other than those offered by tab completion. Both attributes allow tab completion with wildcard matching.

With ValidateSet, the values cannot be influenced by the other parameter values, whereas with ArgumentCompleter we can generate the dynamic values of a parameter based on the other parameter values. ArgumentCompleter produces the tab-completion values as defined in the scriptblock, which takes certain optional, positional parameters and is executed each time you hit Tab; the same is true of the dynamic ValidateSet attribute. PowerShell requires the tab-completion values from the scriptblock in the form of an array to navigate through on each Tab.

Using the ValidateSet Attribute

The function below, Get-Country, demonstrates how it works with the ValidateSet parameter attribute.

Function Get-Country
{
    [CmdLetBinding()]
    param
    (
        [parameter(Mandatory = $true)]
        [ValidateSet('India', 'USA', 'UK', 'Canada', 'Australia')]
        [string] $CountryName
    )
    ## Write your code here
    Write-Host $CountryName
}

At the command line, when you type Get-Country -CountryName <Tab>, it will navigate through the values defined in the ValidateSet, but you can't enter values other than those in the ValidateSet.
Using a dynamic ValidateSet Attribute

To generate dynamic values for use with the ValidateSet attribute, define the functionality in a class and then use the class with the ValidateSet attribute.

class SupportedCountries : System.Management.Automation.IValidateSetValuesGenerator
{
    [string[]] GetValidValues()
    {
        ## Write your code here
        $Countries = @('India', 'USA', 'UK', 'Canada', 'Australia')
        return $Countries
    }
}

Function Get-Country
{
    [CmdLetBinding()]
    param
    (
        [parameter(Mandatory = $true)]
        [ValidateSet([SupportedCountries])]
        [string] $CountryName
    )
    ## Write your code here
    Write-Host $CountryName
}

The ArgumentCompleter script block

As mentioned above, the scriptblock in the ArgumentCompleter has a few optional, positional parameters that can be used if required; let's see what those parameters are…

$CommandName (Position 0): Set to the name of the command for which the script block is providing tab completion.
$ParameterName (Position 1): Set to the parameter whose value requires tab completion.
$WordToComplete (Position 2): Set to the value the user has provided before pressing Tab. Your script block should use this value to determine the tab-completion values.
$CommandAst (Position 3): Set to the Abstract Syntax Tree (AST) for the current input line.
$FakeBoundParameters (Position 4): Set to a hashtable containing the $PSBoundParameters for the cmdlet before the user pressed Tab.

The parameter names can be anything, but they should be in the same positions.
Using the ArgumentCompleter Attribute

The Get-Country function below offers the values specified in the scriptblock through tab completion, and also allows other values to be passed manually…

Function Get-Country
{
    [CmdLetBinding()]
    param
    (
        [parameter(Mandatory = $true)]
        [ArgumentCompleter( {
            param ( $CommandName, $ParameterName, $WordToComplete, $CommandAst, $FakeBoundParameters )
            # Write your code here
            return @('India', 'USA', 'UK', 'Canada', 'Australia')
        })]
        [string] $CountryName
    )
    Write-Host $CountryName
}

In the above example, when you type Get-Country -CountryName <Tab>, it will navigate through the country names specified in the scriptblock, and it also accepts other country names as well. If you use a wildcard it will navigate through the matching values only; of course, this is applicable to the ValidateSet attribute as well. Now let's see another example that dynamically generates the tab-completion values based on another parameter's value…

Function Get-Country
{
    [CmdLetBinding()]
    param
    (
        [parameter(Mandatory = $false)]
        [ValidateSet('''North America''', 'Europe', 'Asia', 'Oceania')]
        [string] $Continent,

        [parameter(Mandatory = $true)]
        [ArgumentCompleter( {
            param ( $CommandName, $ParameterName, $WordToComplete, $CommandAst, $FakeBoundParameters )
            $CountriesByContinent = @{
                'North America' = @('USA', 'Canada')
                Europe          = @('UK', 'Germany')
                Asia            = @('India', '''Sri Lanka''')
                Oceania         = @('''New Zealand''', 'Australia')
            }
            if ($FakeBoundParameters.ContainsKey('Continent'))
            {
                $CountriesByContinent[$FakeBoundParameters.Continent] |
                    Where-Object { $_ -like "$WordToComplete*" }
            }
            else
            {
                $CountriesByContinent.Values | ForEach-Object { $_ }
            }
        })]
        [string] $CountryName
    )
    Write-Host $CountryName
}

In the above example I have not used all the parameters, but their values would be as follows…

$CommandName is Get-Country
$ParameterName is CountryName
$WordToComplete is empty in this example, but if you use Get-Country -CountryName Ger<Tab>
then the value of this parameter is Ger.
$CommandAst is Get-Country -CountryName
$FakeBoundParameters: if you run Get-Country -Continent Asia -CountryName India, then the value of this parameter is @{Continent = 'Asia'}, which is a hashtable; if there are any other bound parameters then they will also be returned in the same hashtable.

Using the Register-ArgumentCompleter CmdLet

To give a parameter of a CmdLet or a Function the ability to tab through its valid values, you can use the Register-ArgumentCompleter CmdLet to register argument completers to navigate through on each Tab hit. Register-ArgumentCompleter accepts a script block just like the ArgumentCompleter parameter attribute, with all the same optional, positional parameters. Now let's see how it works…

Function Get-Country
{
    [CmdLetBinding()]
    param
    (
        [parameter(Mandatory = $false)]
        [ValidateSet('''North America''', 'Europe', 'Asia', 'Oceania')]
        [string] $Continent,

        [parameter(Mandatory = $true)]
        [string] $CountryName
    )
    ## Write your code here
    Write-Host $CountryName
}

In the above function there is no ValidateSet or ArgumentCompleter parameter attribute on the CountryName parameter; now we will register argument completers using the Register-ArgumentCompleter CmdLet to enable tab completion, using the same script block we used in the example above.
$ScriptBlock = [scriptblock]::Create({
    param ( $CommandName, $ParameterName, $WordToComplete, $CommandAst, $FakeBoundParameters )
    $CountriesByContinent = @{
        'North America' = @('USA', 'Canada')
        Europe          = @('UK', 'Germany')
        Asia            = @('India', '''Sri Lanka''')
        Oceania         = @('''New Zealand''', 'Australia')
    }
    if ($FakeBoundParameters.ContainsKey('Continent'))
    {
        $CountriesByContinent[$FakeBoundParameters.Continent] |
            Where-Object { $_ -like "$WordToComplete*" }
    }
    else
    {
        $CountriesByContinent.Values | ForEach-Object { $_ }
    }
})

Register-ArgumentCompleter -CommandName Get-Country -ParameterName CountryName -ScriptBlock $ScriptBlock

You can also register argument completers for CmdLets or Functions from any vendor, and for native applications as well.

$ScriptBlock = [scriptblock]::Create({
    param ( $CommandName, $ParameterName, $WordToComplete, $CommandAst, $FakeBoundParameters )
    $Shares = Get-SmbShare | Where-Object { $_.Name -like "$WordToComplete*" }
    $Shares | ForEach-Object {
        New-Object -Type System.Management.Automation.CompletionResult -ArgumentList $_.Name, $_.Name, 'ParameterValue', $_.Name
    }
})

Register-ArgumentCompleter -CommandName Set-SmbShare -ParameterName Name -ScriptBlock $ScriptBlock
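For native applications, Register-ArgumentCompleter has a -Native switch; in that case the script block receives a different parameter set: the word to complete, the command AST, and the cursor position. A minimal sketch using git as the native command (the subcommand list here is illustrative only, not git's full set):

```powershell
# Native completers receive ($wordToComplete, $commandAst, $cursorPosition)
Register-ArgumentCompleter -Native -CommandName git -ScriptBlock {
    param ($wordToComplete, $commandAst, $cursorPosition)
    # Offer a few common subcommands that match what was typed so far
    'status', 'add', 'commit', 'push', 'pull' |
        Where-Object { $_ -like "$wordToComplete*" } |
        ForEach-Object {
            [System.Management.Automation.CompletionResult]::new($_, $_, 'ParameterValue', $_)
        }
}
```

After registering, typing git st<Tab> offers status from the list above.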
You can get the alias by name using Get-Alias with the -Name parameter…

Get-Alias -Name gci

To get all the aliases of a CmdLet, use Get-Alias with the -Definition parameter…

Get-Alias -Definition Get-ChildItem
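Going the other way, you can create your own alias with Set-Alias; a quick sketch (the alias name ll is just an example):

```powershell
# Create an alias 'll' for Get-ChildItem in the current session
Set-Alias -Name ll -Value Get-ChildItem

# Verify it
Get-Alias -Name ll
```

Aliases created this way last only for the current session; add the Set-Alias line to your PowerShell profile to make it permanent.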
Converting a string to upper and lower case is possible with the string object using dot notation, but, for whatever reason, title case is not directly supported by the string object in PowerShell yet. However, there are a couple of ways to achieve the same…

$Text = 'kiran patnayakuni'

# Using PowerShell
(Get-Culture).TextInfo.ToTitleCase($Text)

# Using .NET
[cultureinfo]::CurrentCulture.TextInfo.ToTitleCase($Text)
To see the overload definitions/syntax of a method of an object, use the OverloadDefinitions property of the method with simple dot notation, or simply use the method name without the parentheses ().

<object>.<method>.OverloadDefinitions
# or
<object>.<method>

Example:

$Text = "PowerShell"

# To see all the methods of an object
$Text | Get-Member -MemberType *Method

$Text.LastIndexOf
# or
$Text.LastIndexOf.OverloadDefinitions
Wishing you all… Happy New Year - 2020 The year 2019 was extremely good for me and helped me to cross a few milestones in my career and in my personal life as well. Thank you very much everyone who supported and helped me in my growth! I have learned a bit and there is a lot to learn. Undoubtedly, the year 2019 was very successful for me, and I hope it is the same with you all. I have set my goals for the year 2020 and will work hard to achieve all of them, and I hope it will be one of the best years of my life. Once again, I wish you all a very happy and successful new year 2020. I have started my celebrations by writing a small snippet to wish you a happy new year in the console using PowerShell… It can be directly invoked from my GitHub repo… iex (irm https://bit.ly/2G19w4s); Wish-HappyNewYear
As we all know, all PowerShell folks use the following three CmdLets, irrespective of their expertise…

Get-Command
Get-Help
Get-Member

But they should still be used properly to get exactly what you want; otherwise you will definitely mess up your screen. Just to stress what I mean, let me give you some examples. To see just the syntax of a command you don't need Get-Help, or Get-Help with -Detailed or -Full; you can simply use Get-Command with the -Syntax parameter…

Get-Command -Name Get-ComputerInfo -Syntax

To see the help for a particular parameter you don't need Get-Help with -Detailed or -Full; you can simply use it with -Parameter…

Get-Help -Name Get-SmbShare -Parameter ScopeName

One last example: to see only the properties of an object you don't need to use Get-Member alone; you can use it along with the -MemberType parameter…

Get-Date | Get-Member -MemberType Properties

That's all with the examples; now let's see various use cases for the above three CmdLets…

Listing/Finding the commands using Get-Command

# To list all the commands available in the current session
Get-Command

<# Search the commands with the name using a wildcard; it returns all types of
commands matching the name with the given pattern, and you can also filter
further down the commands using -Module, -CommandType or both #>
Get-Command -Name *VHD*
Get-Command -Name *VHD* -CommandType Function
Get-Command -Name *VHD* -Module Hyper-V
Get-Command -Name *VHD* -CommandType Cmdlet -Module Hyper-V

<# Search the commands with Verb or Noun, or both, using direct names or
wildcards, and with -Module as well #>
Get-Command -Verb Get
Get-Command -Noun SmbShare
Get-Command -Verb Set -Noun Smb*
Get-Command -Verb Get -Noun Smb* -Module SmbShare

<# List the commands in a module; again, you can filter further down using
-Name or -Verb & -Noun, and -CommandType #>
Get-Command -Module SqlServer
Get-Command -Module SqlServer -CommandType Alias

<# Search commands by either
the parameter name or parameter type, or both; you can still filter further
down using -Name, or -Verb & -Noun, -Module and -CommandType #>
Get-Command -ParameterName VMName
Get-Command -ParameterType System.Boolean
Get-Command -ParameterName Scoped -ParameterType System.Boolean

<# List the commands from the modules loaded in the current session; you can
still filter further down using -Name, or -Verb & -Noun, -Module,
-CommandType, -ParameterName and -ParameterType #>
Get-Command -ListImported

<# To limit the output count; works with all possible parameter combinations #>
Get-Command -TotalCount 10
Get-Command -ListImported -TotalCount 10

<# Search the commands from closest match to least likely match. You can use
this switch if you are not sure about the exact name of the command; you can't
use it with a wildcard search, and it works only in combination with the
-Name parameter #>
Get-Command -Name gotcemmand -UseFuzzyMatching

<# List the commands with the same name from different sources. To test this,
create an empty Write-Error function and then run the command below.
To create an empty function: Function Write-Error {} #>
Get-Command -Name Write-Error -All
# Or to list all
Get-Command -All

<# List the commands using the -FullyQualifiedModule parameter to list the
commands from a specific version of a module;
-FullyQualifiedModule and -Module are mutually exclusive #>
Get-Command -FullyQualifiedModule @{ModuleName = "UEV"; ModuleVersion = "2.1.639.0" }

<# Get the command count; Get-Command can be used with all possible parameter
combinations #>
Get-Command | Measure-Object -Line | Select-Object -ExpandProperty Lines

<# List commands that return output, and their output types; Get-Command can
be used with all possible parameter combinations #>
Get-Command | Where-Object OutputType | Format-List

Get information about a specific CmdLet using Get-Command

# Get the command basic info
Get-Command -Name Get-ComputeProcess

# Get the syntax of a given
command
Get-Command -Name Get-Counter -Syntax

# Get complete information about the given command
Get-Command -Name New-NetIPAddress | Format-List *

# Get all the parameters of a given command
Get-Command -Name Get-ControlPanelItem | Select-Object -ExpandProperty Parameters
# Or
(Get-Command -Name Get-ControlPanelItem).Parameters

# Get the module name of a given command
Get-Command -Name Show-ControlPanelItem | Select-Object -ExpandProperty ModuleName
# Or
(Get-Command -Name Show-ControlPanelItem).ModuleName

# Get the definition of a given command
# For CmdLets you see only the syntax; it works only for Functions
Get-Command -Name Get-NetAdapterStatistics | Select-Object -ExpandProperty Definition
# Or
(Get-Command -Name Get-NetAdapterStatistics).Definition

# Get the command output type
Get-Command -Name Get-NetAdapterHardwareInfo | Select-Object -ExpandProperty OutputType
# Or
(Get-Command -Name Get-NetAdapterHardwareInfo).OutputType

# Get the command's default parameter set
Get-Command -Name Get-Disk | Select-Object -ExpandProperty DefaultParameterSet
# Or
(Get-Command -Name Get-Disk).DefaultParameterSet

# Get the type of a given command
Get-Command -Name Get-NetRoute | Select-Object -ExpandProperty CommandType
# Or
(Get-Command -Name Get-NetRoute).CommandType

# Get the dynamic parameter list of a given command (if any)
Get-Command -Name Get-Package -ArgumentList 'Cert:' |
    Select-Object -ExpandProperty ParameterSets |
    ForEach-Object { $_.Parameters } |
    Where-Object { $_.IsDynamic } |
    Select-Object -Property Name -Unique

You can also try Show-Command instead of Get-Command.

Get help for a specific CmdLet or about topic using Get-Help

# To get the basic help
Get-Help -Name Get-WULastInstallationDate

# To get the parameter descriptions & examples in addition to the basic help
Get-Help -Name Test-WSMan -Detailed

# To get the comprehensive help, including parameter descriptions and attributes,
# examples, input and output object types, and additional notes
Get-Help -Name Invoke-Expression -Full

# To get the help with examples only
Get-Help -Name New-LocalGroup -Examples

# To get the online help in a browser separately
Get-Help -Name Test-Connection -Online

# To get the full help in a separate window
Get-Help -Name Get-Process -ShowWindow

# To get the help for a specific parameter of a command
Get-Help -Name Get-NetConnectionProfile -Parameter InterfaceIndex

# To get the help for all the parameters of a command
Get-Help -Name Compress-Archive -Parameter *

# You can use an alias name as well; works with all the above parameter combinations
Get-Help -Name ls

# To get the help for a script (if available); works with all the above parameter combinations
Get-Help -Name C:\GitRepo\Test-Script.ps1

# To list the available help matching a specific word
Get-Help -Name netconnection

# To list all the conceptual topics
Get-Help -Name about_*

# To get the help for a specific conceptual topic
Get-Help -Name about_ForEach-Parallel

Get members of an object using Get-Member

# Get all the members of an output object of Get-StartApps
Get-StartApps | Get-Member

# Get all the members including the intrinsic members and compiler-generated
# members of the objects, which are hidden by default
$FirewallRules = Get-NetFirewallRule -Name FPS-*
$FirewallRules | Get-Member -Force
$FirewallRules.psbase

# Get all the extended members of an InputObject; works with the pipeline as well
$VMProcessors = Get-VMProcessor -VMName Lab-ClientX
Get-Member -InputObject $VMProcessors -View Extended

# Get all the details of a member by name
Get-NetTCPConnection | Get-Member -Name State | Format-List *

## Get the members by type
Get-NetAdapter | Get-Member -MemberType Properties      # All types of properties
Get-NetAdapter | Get-Member -MemberType ScriptProperty  # ScriptProperties only
Get-NetAdapter | Get-Member -MemberType Methods         # All methods
Get-NetAdapter | Get-Member -MemberType Method          # Only method type
String Operations in PowerShell

Example string
C:\> $varString = 'This is an example string for PowerShell string operations'

String assignment
C:\> $otherString = $varString.Clone()
# OR
C:\> $otherString = $varString
C:\> $otherString
Output:
This is an example string for PowerShell string operations

Change the string to upper case
C:\> $varString.ToUpper()
Output:
THIS IS AN EXAMPLE STRING FOR POWERSHELL STRING OPERATIONS

Change the string to lower case
C:\> $varString.ToLower()
Output:
this is an example string for powershell string operations

Change the string to title case
C:\> (Get-Culture).TextInfo.ToTitleCase($varString)
Output:
This Is An Example String For Powershell String Operations

Compare with another string
C:\> $otherString = 'an example string'
C:\> $varString.CompareTo($otherString) # 1 - this string sorts after the other
C:\> $otherString = 'This is an example string for PowerShell string operations'
C:\> $varString.CompareTo($otherString) # 0 - exact match
C:\> $otherString = 'This is an example string for X PowerShell string operations'
C:\> $varString.CompareTo($otherString) # -1 - this string sorts before the other
Output:
1
0
-1

Verify whether the given string is an exact match or not
C:\> $otherString = 'This is an example string for PowerShell string operations'
C:\> $varString.Equals($otherString)
# It returns only True or False; True for an exact match, otherwise False
Output:
True

Verify whether this string contains the given string or not
C:\> $matchString = 'PowerShell'
C:\> $varString.Contains($matchString) # Returns True
# It's case-sensitive when matching against the other string
C:\> $matchString = 'powershell'
C:\> $varString.Contains($matchString) # Returns False
# Using the 'Select-String' CmdLet
C:\> $matchString = 'powershell'
C:\> Select-String -InputObject $varString -Pattern $matchString -Quiet # Returns True
C:\> Select-String -InputObject $varString -Pattern $matchString -CaseSensitive -Quiet # Returns False/null
Output:
True
False
True

Verify
that the given string matches the start of the string
C:\> $matchString = 'This is'
C:\> $varString.StartsWith($matchString) # Returns True
# All these match methods are case sensitive
C:\> $matchString = 'this'
C:\> $varString.StartsWith($matchString) # Returns False
Output:
True
False

Verify that the given string matches the end of the string
C:\> $otherString = 'operations'
C:\> $varString.EndsWith($otherString) # Returns True
# Again, this is also case sensitive
Output:
True

Add leading spaces/characters
C:\> $str = 'Hello'
C:\> $str.PadLeft(10,'#') # If the character is not specified, it pads with spaces by default
Output:
#####Hello

Add trailing spaces/characters
C:\> $str = 'Hello'
C:\> $str.PadRight(10,'#') # If the character is not specified, it pads with spaces by default
Output:
Hello#####

Get the length of the given string
C:\> $varString.Length
Output:
58

Find the starting index of a given character in a string
C:\> $char = 'l'
C:\> $varString.IndexOf($char)
# You can set the starting index
C:\> $varString.IndexOf($char,17)
# Ignoring the case
C:\> $char = 'L'
C:\> $varString.IndexOf($char,[System.StringComparison]::CurrentCultureIgnoreCase)
# All the above examples work the same with a 'string' as well
Output:
16
38
16

Find the starting index of any of the given characters in a string
C:\> $arrayChar = @('a','e','i','o','u')
C:\> $varString.IndexOfAny($arrayChar)
# You can also set the starting index
C:\> $varString.IndexOfAny($arrayChar,3)
Output:
2
5

Find the last index of a given character in a string
C:\> $char = 'e'
C:\> $varString.LastIndexOf($char)
# You can set the starting index
C:\> $varString.LastIndexOf($char,40)
# Ignoring the case
C:\> $char = 'E'
C:\> $varString.LastIndexOf($char,[System.StringComparison]::CurrentCultureIgnoreCase)
# All the above examples work the same with a 'string' as well
Output:
50
37
50

Find the last index of any of the given characters in a string
C:\> $arrayChar = @('a','e','i','o','u')
C:\> $varString.LastIndexOfAny($arrayChar)
# You can also
set the starting index
C:\> $varString.LastIndexOfAny($arrayChar,40)
Output:
55
37

Find the indexes of all the occurrences
C:\> (Select-String -InputObject $varString -Pattern 'l' -AllMatches).Matches.Index
Output:
16
38
39

Split a string separated by a space
C:\> $varString.Split()
Output:
This
is
an
example
string
for
PowerShell
string
operations

Split a string by a given character
C:\> $varStr = 'First Name: Kiran,Last Name: Patnayakuni,City: Bangalore,Course: PowerShell'
C:\> $separator = ','
C:\> $varStr.Split($separator)
Output:
First Name: Kiran
Last Name: Patnayakuni
City: Bangalore
Course: PowerShell

Split a string with implicit conversion
C:\> (Get-Date).Split() # Throws an error
C:\> (Get-Date) -split ' ' # Instead of a space you can give any character or even a string
Output:
InvalidOperation: Method invocation failed because [System.DateTime] does not contain a method named 'Split'.
12/11/2019
18:27:45

Join two or more strings
C:\> -join ('Well', 'come') # You can join any number of strings separated by commas
C:\> [string]::Concat('Honey','well') # You can join any number of strings separated by commas
C:\> -join ('Good', ' ', 'Morning')
C:\> 'Hello', 'World' -join ' ' # The separator can be any character or even a string
Output:
Wellcome
Honeywell
Good Morning
Hello World

Convert other data types to the string data type
C:\> Get-Date # Return type datetime
C:\> (Get-Date).ToString() # Return type string, and you can set the format inside the parentheses.
You can convert to the string data type from any other data type where possible
C:\> Get-Date | Out-String # Return type string
Output:
11 December 2019 19:03:02
11-12-2019 19:03:15
11 December 2019 19:03:34

Trim the leading & trailing spaces/characters of a given string
C:\> $strWithSpaces = ' Hello World '
C:\> Write-Host "This statement '$strWithSpaces' has some leading and trailing spaces"
# Trim the spaces
C:\> $strWithSpaces = $strWithSpaces.Trim()
C:\> Write-Host "This statement '$strWithSpaces' has no leading or trailing spaces"
C:\> $strWithExtChars = '~~~~~~~~Hello World~~~~~~~~'
C:\> $strWithExtChars
# Trim the extra characters
C:\> $strWithExtChars = $strWithExtChars.Trim('~')
C:\> $strWithExtChars
# Trim strings as well (note: the argument is treated as a set of characters to trim, not as a substring)
C:\> $FileName = "xxxpowershell.exexxx"
C:\> $FileName.Trim("xxx")
Output:
This statement ' Hello World ' has some leading and trailing spaces
This statement 'Hello World' has no leading or trailing spaces
~~~~~~~~Hello World~~~~~~~~
Hello World
powershell.exe

Trim the leading spaces/characters of a given string
C:\> $strWithLeadingSpaces = ' Hello World'
C:\> Write-Host "This statement '$strWithLeadingSpaces' has some leading spaces"
# Trim the leading spaces
C:\> $strWithLeadingSpaces = $strWithLeadingSpaces.TrimStart()
C:\> Write-Host "This statement '$strWithLeadingSpaces' has no leading spaces"
C:\> $strWithExtChars = '~~~~~~~~Hello World~~~~~~~~'
C:\> $strWithExtChars
# Trim the extra leading characters
C:\> $strWithExtChars = $strWithExtChars.TrimStart('~')
C:\> $strWithExtChars
# Trim strings as well (again, a set of characters, not a substring)
C:\> $FileName = "powershell.exe"
C:\> $FileName.TrimStart("power")
Output:
This statement ' Hello World' has some leading spaces
This statement 'Hello World' has no leading spaces
~~~~~~~~Hello World~~~~~~~~
Hello World~~~~~~~~
shell.exe

Trim the trailing spaces/characters of a given string
C:\> $strWithTrailingSpaces = 'Hello World '
C:\> Write-Host "This statement '$strWithTrailingSpaces' has some trailing spaces"
# Trim the spaces
C:\> $strWithTrailingSpaces = $strWithTrailingSpaces.TrimEnd()
C:\> Write-Host "This statement '$strWithTrailingSpaces' has no trailing spaces"
C:\> $strWithExtChars = '~~~~~~~~Hello World~~~~~~~~'
C:\> $strWithExtChars
# Trim the extra trailing characters
C:\> $strWithExtChars = $strWithExtChars.TrimEnd('~')
C:\> $strWithExtChars
# Trim strings as well (a set of characters, not a substring)
C:\> $FileName = "powershell.exe"
C:\> $FileName.TrimEnd(".exe")
Output:
This statement 'Hello World ' has some trailing spaces
This statement 'Hello World' has no trailing spaces
~~~~~~~~Hello World~~~~~~~~
~~~~~~~~Hello World
powershell

Generate a random string
C:\> $randomStrLength = 16
# With all lower, upper, numeric and all special characters
C:\> -join ((33..126) | Get-Random -Count $randomStrLength | % {[char]$_})
# With all lower and numeric characters only
C:\> -join ((97..122) + (48..57) | Get-Random -Count $randomStrLength | % {[char]$_})
# With all upper and numeric characters only
C:\> -join ((65..90) + (48..57) | Get-Random -Count $randomStrLength | % {[char]$_})
# With all lower, numeric and some special characters
C:\> -join ((97..122) + (33..64) | Get-Random -Count $randomStrLength | % {[char]$_})
# In your case the outputs will be different
Output:
-d!Z~k8r3,%JGbKC
cep976xaz2mqtwgr
9T061UFDL3BIE82A
au<y:>;s#86(mf.5

Remove a specified number of characters from a given string
# Remove all the characters starting from the 18th index
C:\> $varString.Remove(18)
# Remove a specified number of characters
C:\> $varString.Remove(18,22)
Output:
This is an example
This is an example string operations

Insert a specified string into a given string
C:\> $myString = 'Good Kiran'
C:\> $InsString = ' Morning'
C:\> $myString.Insert(4, $InsString)
Output:
Good Morning Kiran

Replace a specified string in a given string
C:\> $myString = 'My name is kiran patnayakuni'
C:\> $myString.Replace(' ', ',')
C:\> $myString.Replace('p','P')
C:\> $myString.Replace('kiran','Kirankumar')
# Replace with
implicit conversion
C:\> (Get-Date).Date -replace '-', ''
Output:
My,name,is,kiran,patnayakuni
My name is kiran Patnayakuni
My name is Kirankumar patnayakuni
12122019 00:00:00

Get a substring from a given string
C:\> $myString = 'WindowsPowerShell'
C:\> $myString.Substring(7) # Till the end
C:\> $myString.Substring(7,5) # Length of 5
Output:
PowerShell
Power

Placeholders
C:\> $FirstName = 'Kiran'
C:\> $LastName = 'Patnayakuni'
C:\> $DOJ = '30-08-2015'
C:\> $Organization = 'ABC Company'
C:\> "Hello - {0}, {1} has joined in the organization {2} on {3}." -f $FirstName, $LastName, $Organization, $DOJ
Output:
Hello - Kiran, Patnayakuni has joined in the organization ABC Company on 30-08-2015.

Verify whether the string is null or empty
$myString = $null
[string]::IsNullOrEmpty($myString) # Returns True
$myString = ''
[string]::IsNullOrEmpty($myString) # Returns True
$myString = ' '
[string]::IsNullOrEmpty($myString) # Returns False
### In all other cases it returns False
Output:
True
True
False

Verify whether the string is null or white space
$myString = $null
[string]::IsNullOrWhiteSpace($myString) # Returns True
$myString = ''
[string]::IsNullOrWhiteSpace($myString) # Returns True
$myString = ' '
[string]::IsNullOrWhiteSpace($myString) # Returns True
### In all other cases it returns False
Output:
True
True
True
About Terraform

Terraform is an Infrastructure as Code (IaC) tool from HashiCorp; it is an open-source, cross-platform and multi-cloud infrastructure deployment tool. It uses the HashiCorp Configuration Language (HCL) and supports all popular cloud service providers.

Terraform Installation

Terraform ships as a single binary in a zip archive. You need to download it from the official Terraform downloads page and extract the archive; you can use it without installing anything, and you can also use the Chocolatey package manager to download Terraform. To use the binary globally you need to add its location to the PATH. Alternatively, the PowerShell script below does all these steps for you and makes it ready to run your Terraform deployments…

Function Install-Terraform {
    # Ensure the function runs with administrator privileges
    if (-not (New-Object Security.Principal.WindowsPrincipal([Security.Principal.WindowsIdentity]::GetCurrent())).IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) { Write-Host -ForegroundColor Red -Object "!!!
Please run as Administrator !!!"; return }
    # Terraform download URL
    $Url = 'https://www.terraform.io/downloads.html'
    # Local path to download the Terraform zip file
    $DownloadPath = 'C:\Terraform\'
    # Registry key to set the persistent PATH
    $RegPathKey = 'HKLM:\System\CurrentControlSet\Control\Session Manager\Environment'
    # Create the local folder if it doesn't exist
    if ((Test-Path -Path $DownloadPath) -eq $false) { $null = New-Item -Path $DownloadPath -ItemType Directory -Force }
    # Download the Terraform exe in zip format
    $Web = Invoke-WebRequest -Uri $Url
    $FileInfo = $Web.Links | Where-Object href -match windows_amd64
    $DownloadLink = $FileInfo.href
    $FileName = Split-Path -Path $DownloadLink -Leaf
    $DownloadFile = [string]::Concat( $DownloadPath, $FileName )
    Invoke-RestMethod -Method Get -Uri $DownloadLink -OutFile $DownloadFile
    # Extract & delete the zip file
    Expand-Archive -Path $DownloadFile -DestinationPath $DownloadPath -Force
    Remove-Item -Path $DownloadFile -Force
    # Set the persistent path in the registry if it is not set already
    if ($DownloadPath -notin $($ENV:Path -split ';')) {
        $PathString = (Get-ItemProperty -Path $RegPathKey -Name PATH).Path
        $PathString += ";$DownloadPath"
        Set-ItemProperty -Path $RegPathKey -Name PATH -Value $PathString
        # Set the path for the current session
        $ENV:Path += ";$DownloadPath"
    }
    # Verify the download
    Invoke-Expression -Command "terraform version"
}

You can run this function any number of times: if Terraform is not configured, it will download it and configure the PATH; if it is already there, it will replace/upgrade it with the latest version. You can visit the Terraform docs to get started with Terraform, and to get started with Azure deployments using Terraform you can visit Microsoft Docs.
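Once Terraform is on the PATH, a deployment starts from a configuration file. As a minimal sketch (the provider block and the resource names here are illustrative, not from the original post), a main.tf for Azure might look like this…

```hcl
# Configure the Azure provider (illustrative example)
provider "azurerm" {
  features {}
}

# Create a resource group
resource "azurerm_resource_group" "example" {
  name     = "example-rg"
  location = "southindia"
}
```

Then run terraform init, terraform plan and terraform apply from the folder containing the file.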
When a variable is assigned a value, it won't be printed on the console by default; you need to print it separately.

C:\> $varNumber = 10
C:\> $varNumber

Variable squeezing is nothing but printing the value on the console while assigning it to a variable; for that you need to wrap the assignment in parentheses ().

C:\> ($varNumber = 10)
10

Let's take another example…

C:\> $varString = ('This is an example of variable squeezing').Split(' ')
C:\> $varString
This
is
an
example
of
variable
squeezing
C:\> $varString[2..$($varString.Length - 1)] -join ' '
an example of variable squeezing

In the example above, you can use the object's properties only after it is assigned, but with variable squeezing you can call the properties while assigning to the variable itself…

C:\> ($varString = ('This is an example of variable squeezing').Split(' '))[2..$($varString.Length - 1)] -join ' '
an example of variable squeezing

One last example:

if ($null -ne ($HostInfo = Get-Host)) { $HostInfo.Version }

In the if statement itself you can assign the variable, and in the script block you can use it directly. This way you can cut down a few lines of code in your scripts.
Everything is an object in PowerShell, and every object has its own properties and methods, and some objects have events as well. When working with object properties, we can't loop through each property by its name & value directly, so to work with the individual properties we can convert the object's properties into a HashTable, enumerate it with .GetEnumerator(), and then manage the properties individually. The script below converts the object properties into a HashTable…

# Converts object properties to a HashTable.
Function Convert-ObjectToHashTable {
    [CmdletBinding()]
    param (
        [parameter(Mandatory=$true,ValueFromPipeline=$true)]
        [pscustomobject] $Object
    )
    $HashTable = @{}
    $ObjectMembers = Get-Member -InputObject $Object -MemberType *Property
    foreach ($Member in $ObjectMembers) {
        $HashTable.$($Member.Name) = $Object.$($Member.Name)
    }
    return $HashTable
}
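Assuming the Convert-ObjectToHashTable function above is loaded, a quick usage sketch might look like this (the sample object and property names are illustrative, not from the original post):

```powershell
# Build a sample object and convert its properties to a HashTable
$Person = [pscustomobject]@{ Name = 'Kiran'; City = 'Bangalore' }
$Hash = $Person | Convert-ObjectToHashTable

# Now each property can be handled individually by name & value
$Hash.GetEnumerator() | ForEach-Object { "{0} = {1}" -f $_.Key, $_.Value }
```

This is the pattern the post describes: once the properties are in a HashTable, .GetEnumerator() lets you loop over them as key/value pairs.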
Do you want to know when your Azure VM was created? This script retrieves the creation date and time for the given Azure VM(s). I am not sure whether there is a direct way to retrieve the VM creation date; at least I couldn't find any cmdlet for it. However, I have written a PowerShell function that reports the VM creation date by considering the VM OS disk's creation date as the VM creation date. The function accepts the combination of Resource Group Name and VM Name as mandatory parameters, or VM object(s), and you will see the output as below… I have also tried the same using the Azure CLI, and you get the output like this… And, here's the script…

PowerShell

<# This script pulls the date and the time at which the Azure VM(s) were created. It accepts Resource Group & VM Name as mandatory parameters and accepts VM object(s) optionally. Since there is no direct cmdlet to fetch the creation date, the OS disk creation date is taken as the VM creation date. #>
Function Get-AzVMCreateDate {
    [CmdletBinding()]
    param (
        [parameter(Mandatory=$true, ValueFromPipelineByPropertyName=$true)]
        [string] $ResourceGroupName, # Resource Group Name
        [parameter(Mandatory=$true, ValueFromPipelineByPropertyName=$true)]
        [string] $Name, # VM Name
        [parameter(Mandatory=$false, ValueFromPipeline=$true)]
        [System.Object[]] $VMObject # VM Object
    )
    Begin {
        # Check if the VM object is from the pipeline
        $IsItVMObject = $null -ne $VMObject
        # Check the login; if not logged in, prompt for it
        if (($IsItVMObject -eq $false) -and ($null -eq $(Get-AzContext -ErrorAction SilentlyContinue))) { Login-AzAccount }
        # Output array object
        $VMArray = @()
    }
    Process {
        # Fetch the VM details from the Resource Group Name and VM Name if provided
        if ($IsItVMObject -eq $false) { $VMObject = Get-AzVM -ResourceGroupName $ResourceGroupName -Name $Name }
        foreach ($VM in $VMObject) {
            # Get the OS disk name
            $VMDiskName = $VM.StorageProfile.OsDisk.Name
            # Get the disk info
            $VMDiskInfo = Get-AzDisk -ResourceGroupName
$VM.ResourceGroupName -DiskName $VMDiskName
            # Get the disk creation date & time
            $VMCreatedDate = $VMDiskInfo.TimeCreated
            # Add the result to the output array
            $VMArray += New-Object -TypeName psobject -Property @{
                ResourceGroup = $VM.ResourceGroupName
                VMName = $VM.Name
                CreateDate = $VMCreatedDate
            }
        }
    }
    End {
        # Output
        return ($VMArray | Select-Object ResourceGroup, VMName, CreateDate)
    }
}

<# Load the function
PS /> . ./Get-AzVMCreateDate.ps1 # on Linux
PS \> . .\Get-AzVMCreateDate.ps1 # on Windows
#>
<# Call the function
PS > Get-AzVMCreateDate
#>

AZCli

# Login to your Azure subscription
#az login
# Declare variables
resourceGroupName='LINUX-RG'
vmName='ubuntu01'
# Get the VM OS disk name
vmdiskname=$(az vm show --resource-group $resourceGroupName --name $vmName -d --query "storageProfile.osDisk.name" -o tsv)
# Get the VM OS disk creation date
createDate=$(az disk show --resource-group $resourceGroupName --name $vmdiskname --query "timeCreated" -o tsv)
# Convert the date to a readable format
createDatef=$(date -d $createDate '+%Y/%m/%d %T')
# Output
printf "%-20s | %-20s | %-20s\n" "$resourceGroupName" "$vmName" "$createDatef"
One of my friends, who is working for a retail store company as a sysadmin, was seeking my help in finding the Windows Server activation status across his organization. Of course, there are many ways to fetch the Windows activation status, and there are plenty of tools available online. But I have used a CIM instance with the WMI class 'SoftwareLicensingProduct' to fetch the activation status for the given server list, and here is the code snippet…

Function Get-WinSrvFromInv {
    <# The purpose of this function is to retrieve the list of servers for which you want to check the Windows activation status. Write your own piece of code to retrieve the servers from your inventory, for example: Get-Content -Path $InvPath\Server.txt #>
    return @("Srv2K19", "Srv2K16", "Srv2K12")
}

Function Get-WindowsActivation {
    <# This is a PowerShell advanced function to retrieve the Windows activation status using CIM classes. It takes one or more servers as input, and it accepts them through the pipeline as well. The script checks the connectivity first and then checks the activation status. It collects the info from all the servers and then returns the results at once.
#>
    [CmdLetBinding()]
    Param (
        [Parameter(Mandatory=$true,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
        [string[]] $ComputerName
    )
    Begin { $ActivationStatus = @() }
    Process {
        foreach ($CN in $ComputerName) {
            $PingStatus = Test-Connection -ComputerName $CN -Count 1 -Quiet
            if ($PingStatus -eq $true) {
                $SPL = Get-CimInstance -ClassName SoftwareLicensingProduct -ComputerName $CN -Filter "PartialProductKey IS NOT NULL" -ErrorAction Stop
                $WinProduct = $SPL | Where-Object Name -like "Windows*"
                $Status = if ($WinProduct.LicenseStatus -eq 1) { "Activated" } else { "Not Activated" }
                $PingStatus = "Yes"
            }
            else {
                # Skip the CIM query when the server is unreachable
                $PingStatus = "No"
                $Status = "Error"
            }
            $ActivationStatus += New-Object -TypeName psobject -Property @{
                ComputerName = $CN
                Status = $Status
                IsPinging = $PingStatus
            }
        }
    }
    End { return $($ActivationStatus | Select-Object -Property ComputerName, IsPinging, Status) }
}

## Invoke the functions.
Get-WinSrvFromInv | Get-WindowsActivation

And you will get the status as below…
Ever since I started working with Windows Terminal for my PowerShell work, I feel like I live in a royal world; I simply love it. Windows Terminal is a terminal emulator for Windows 10 version 18362.0 or higher, and it supports Windows PowerShell, PowerShell Core, Command Prompt, Windows Subsystem for Linux (WSL) & Azure Cloud Shell. As of now, it does not detect PowerShell 7 Preview, but you can add it manually. Windows Terminal comes with a rich tabbed view, and you can easily switch between the consoles/terminals with pre-defined shortcut keys. I have customized my Windows Terminal quite a bit to go along with PowerShell Core as my default terminal, and of course along with all the other terminals as well.

Windows Terminal Installation

Windows Terminal is available via the Windows Store, and you can download it by clicking here. It can only be installed on Windows 10 version 18362.0 or higher; it is still a preview version and an open-source project available on GitHub. Now let's see how I customized my Windows Terminal for PowerShell Core along with all the other terminals…

Windows Terminal's first look right after the installation… It detected PowerShell Core, Windows PowerShell, Command Prompt and Azure Cloud Shell. Since I have already installed the Ubuntu, Debian & Kali Linux WSL distributions, they also show up along with the other terminals.

Windows Terminal Customization

By default, it will pick up Windows PowerShell, PowerShell Core, Command Prompt, any WSL distribution already installed & Azure Cloud Shell, but it will not detect PowerShell 7 Preview. However, you can add it manually… All the settings and configurations are managed through a JSON file associated with Windows Terminal, available in the AppData local folder, and you can take a look here.
Configuration 1: Add PowerShell 7 Preview

You can open the settings file by pressing ctrl+, or by clicking the ˅ button at the top of the Windows Terminal and then clicking Settings; the JSON file will be opened with your default code editor. If you take a look at the JSON file, every terminal has a separate profile with a bunch of settings you can modify as per your need, and the profile for PowerShell Core looks like this…

{
    "acrylicOpacity" : 0.5,
    "closeOnExit" : true,
    "colorScheme" : "Campbell",
    "commandline" : "C:\\Program Files\\PowerShell\\6\\pwsh.exe",
    "cursorColor" : "#FFCCFF",
    "cursorShape" : "bar",
    "fontFace" : "Consolas",
    "fontSize" : 10,
    "guid" : "{574e775e-4f2a-5b96-ac1e-a2962a402336}",
    "historySize" : 9001,
    "icon" : "ms-appx:///ProfileIcons/{574e775e-4f2a-5b96-ac1e-a2962a402336}.png",
    "name" : "PowerShell Core",
    "padding" : "0, 0, 0, 0",
    "snapOnInput" : true,
    "startingDirectory" : "%USERPROFILE%",
    "useAcrylic" : false
},

From the settings above, you can customize things like opacity, colors and font; pretty much everything can be set as per your need. Okay now, using the above profile, let's add PowerShell 7 Preview to Windows Terminal. The profile looks like this, and you can add it in the JSON file under the profiles section…

{
    "acrylicOpacity" : 0.5,
    "closeOnExit" : true,
    "colorScheme" : "Campbell",
    "commandline" : "C:\\Program Files\\PowerShell\\7-preview\\pwsh.exe",
    "cursorColor" : "#FFCCFF",
    "cursorShape" : "bar",
    "fontFace" : "Consolas",
    "fontSize" : 10,
    "guid" : "{77e7e60a-38e6-45fb-9c5a-44510c02c4e4}",
    "historySize" : 9001,
    "icon" : "C:\\Program Files\\PowerShell\\7-preview\\assets\\Powershell_av_colors.ico",
    "name" : "PowerShell 7 Preview",
    "padding" : "0, 0, 0, 0",
    "snapOnInput" : true,
    "startingDirectory" : "%USERPROFILE%",
    "useAcrylic" : false
},

If you compare the settings above, I have modified commandline, guid, icon and name. Generate the new guid using the New-Guid PowerShell cmdlet or [guid]::NewGuid().
That's all; the settings are applied right after the file is saved, and it doesn't require closing and reopening the terminal to take effect — it's on the fly. You can see PowerShell 7 Preview in the tab drop-down list, including the PowerShell 7 icon.

Configuration 2: Setting the background image

What I like the most in Windows Terminal is the background image; it gives me a different and pleasant feel while I am working. Now let's see how to add the background image…

"backgroundImage" : "ms-appdata:///roaming/yourimage.jpg",
"backgroundImageOpacity" : 0.75,
"backgroundImageStretchMode" : "fill",

The above 3 keys will do the magic for you: give the path of the image, set the opacity and the stretch mode. You can use an image or a GIF as well, and for the stretch mode you can use uniformToFill besides fill. My settings are as below…

"backgroundImage" : "C:\\Users\\kiran\\AppData\\Roaming\\Microsoft\\Windows\\PowerShell\\SurrealWallpaper.jpg",
"backgroundImageOpacity" : 0.25,
"backgroundImageStretchMode" : "fill",

And it looks like this… You can still play around with GIF images, opacity settings and color schemes.

Configuration 3: Set up the default terminal

Since I work with PowerShell, especially PowerShell 7 Preview, I want to make PowerShell 7 Preview my default terminal when I open Windows Terminal. Let's see how it can be done… Go to settings and find the key defaultProfile under the globals section…

"defaultProfile" : "{2d647fbe-310d-4d05-852f-8f664e6f490c}",

Currently, PowerShell Core is my default profile; to make PowerShell 7 Preview the default, replace the guid of PowerShell Core with the guid of PowerShell 7 Preview (you can find it under the PowerShell 7 Preview profile) in defaultProfile.
Configuration 4: Other Settings

Under each profile there are various settings to change the behavior of the terminal; let's see a few…

"acrylicOpacity" : 0.5,
"useAcrylic" : true

These settings apply background transparency to the terminal window. Note that this setting works only on physical machines, not on VMs.

"colorScheme" : "Campbell"

The color scheme changes the look & feel of the terminal by applying different colors; by default there are 5 color schemes available in the settings file, and you can also add a different scheme of your own choice.

"fontSize" : 14,

You can set the default font size, and you can also change the font size using ctrl + mouse wheel — scrolling up increases and scrolling down decreases the font size directly in the terminal itself. There are a few other settings where you can give it a try. In addition to the above customization, I tweaked my profile inside PowerShell, and now it looks like this every time I open my Windows Terminal… You can glance at my sample profile here.
A simple REST API GET method call to retrieve the date and time from the http://worldclockapi.com service.

<# This script returns the current date time from http://worldclockapi.com/ using a REST API service. You can find the latest URIs on the site above.
Eastern Standard Time: http://worldclockapi.com/api/json/est/now
Coordinated Universal Time: http://worldclockapi.com/api/json/utc/now
Also supports JSONP — Central European Standard Time: http://worldclockapi.com/api/jsonp/cet/now?callback=mycallback
#>

# UTC time URL
[string] $WorldClockAPIUrl = 'http://worldclockapi.com/api/json/utc/now'

# Invoke the GET method. The API returns the output in JSON format, but by default Invoke-RestMethod converts the JSON into a readable format (pscustomobject)
[psobject] $ApiResult = Invoke-RestMethod -Method Get -Uri $WorldClockAPIUrl

<# Selecting only the current date time from the API output
$id : 1
currentDateTime : 2019-02-27T11:51Z
utcOffset : 00:00:00
isDayLightSavingsTime : False
dayOfTheWeek : Tuesday
timeZoneName : UTC
currentFileTime : 131957418910000000
ordinalDate : 2019-58
serviceResponse :
#>
[string] $UTCTimeString = $ApiResult.currentDateTime

# Convert the string to a datetime using the .NET DateTime class method Parse(), which returns the datetime in the default culture
[datetime] $DateTime = [System.DateTime]::Parse($UTCTimeString)

# Output the datetime
return $DateTime
IT administrators work with remote computers as part of their regular activities, and most of the time they use explicit remoting by passing the -ComputerName parameter to the cmdlets. The most commonly used cmdlet is Invoke-Command, along with other cmdlets like Get-Service, Get-Process, Get-WmiObject, Get-EventLog, etc.

Invoke-Command -ComputerName Server -ScriptBlock { $env:COMPUTERNAME }
Get-Service -ComputerName Server -Name servicename
Get-Process -ComputerName Server -Name processname
Get-WmiObject -Class Win32_OperatingSystem -ComputerName Server
Get-EventLog -ComputerName Server -LogName Application -Newest 5

What is Implicit Remoting in PowerShell?

Implicit remoting brings the modules installed on the remote computer and imports them on your local computer, so you can run the cmdlets natively as if they were installed locally. In fact, the cmdlets run against the remote computer from which the module is imported. We need to establish a connection to the remote computer, load the required modules into the remote session, and then export the session modules to our local computer.

Establish the remote computer session

Open a PowerShell session with elevated privileges and create a new PSSession to the remote computer from which you want to import the module…

$dcsession = New-PSSession -ComputerName DC2K16

The session output will be as below…

$dcsession

Id Name   ComputerName ComputerType  State  ConfigurationName    Availability
-- ----   ------------ ------------  -----  -----------------    ------------
 1 WinRM1 DC2K16       RemoteMachine Opened Microsoft.PowerShell Available

The above command to establish the connection works for a computer joined to the same domain; if the computer is not domain-joined or is from a different domain, you need to add the remote computer name or IP to your local computer's WinRM TrustedHosts list and pass the credentials to the cmdlet using the -Credential parameter.
Load the required module into the remote session

Basically, I want to connect to my DC server and export the ActiveDirectory module to my local session…

Invoke-Command -Session $dcsession -ScriptBlock { Import-Module -Name ActiveDirectory }

The module is loaded in the remote session and it is ready to export into the local PowerShell session.

Export the module from the remote session to the local session

You can export the module with a different name altogether and also add a prefix to the cmdlets at the time of loading the module. When you export the module, it is created under $env:PSModulePath.

Export-PSSession -Session $dcsession -CommandName *-AD* -OutputModule RemoteDCModule -AllowClobber

Directory: C:\Users\kiran\Documents\WindowsPowerShell\Modules\RemoteDCModule

Mode   LastWriteTime        Length Name
----   -------------        ------ ----
-a---- 05-02-2019 10:32 PM      99 RemoteDCModule.format.ps1xml
-a---- 05-02-2019 10:32 PM     594 RemoteDCModule.psd1
-a---- 05-02-2019 10:32 PM  396254 RemoteDCModule.psm1

Exporting the module is nothing but PowerShell creating functions, by default with the same names as the cmdlets, to execute on the remote computer the module was exported from, and then wrapping all these functions into a proper script module (.psm1) ready to import.

Import the module and run the cmdlets

The exported module is ready to be copied and imported on any computer, provided the computer can connect to the remote server. When you import the module and run a cmdlet for the first time within the session, it creates a new PSSession to the remote computer and then executes the command, and the same PSSession is used for as long as the current session is alive.

# Without prefix
Import-Module -Name RemoteDCModule
# With prefix
Import-Module -Name RemoteDCModule -Prefix FromDC

Adding a prefix helps us to identify the cmdlets easily; it's not mandatory though.
# On the remote computer
Import-Module ActiveDirectory
Get-Command -Module ActiveDirectory | Select-Object -First 4

CommandType Name                                            Version Source
----------- ----                                            ------- ------
Cmdlet      Add-ADCentralAccessPolicyMember                 1.0.0.0 ActiveDirectory
Cmdlet      Add-ADComputerServiceAccount                    1.0.0.0 ActiveDirectory
Cmdlet      Add-ADDomainControllerPasswordReplicationPolicy 1.0.0.0 ActiveDirectory
Cmdlet      Add-ADFineGrainedPasswordPolicySubject          1.0.0.0 ActiveDirectory

Notice the CommandType is Cmdlet on the remote computer when you actually import the module.

# On the local computer, import the exported module with a prefix
Import-Module RemoteDCModule -Prefix FromDC
Get-Command -Module RemoteDCModule | Select-Object -First 4

CommandType Name                                               Version Source
----------- ----                                               ------- ------
Function    Add-FromDCADCentralAccessPolicyMember              1.0     RemoteDCModule
Function    Add-FromDCADComputerServiceAccount                 1.0     RemoteDCModule
Function    Add-FromDCADDomainControllerPasswordReplication... 1.0     RemoteDCModule
Function    Add-FromDCADFineGrainedPasswordPolicySubject       1.0     RemoteDCModule

Here the CommandType is Function; also notice the prefix FromDC in the noun. As I mentioned earlier, PowerShell creates a function to execute the actual command using Invoke-Command on the remote computer. If you want to know more about how the function is created, pick any function from the exported module and see its definition using the command below…

# Syntax
(Get-Command -Name <function-name>).Definition
# Example
(Get-Command -Name Get-FromDCADUser).Definition

Execute the cmdlets

Since the exported module is saved in the local filesystem, you don't need to create the remote computer session every time you execute the commands; the module takes care of establishing the connection to the remote computer and executing the commands against it.
So you can remove the remote session…

Remove-PSSession -Session $dcsession

Now run any command and notice that a new session is created automatically.

So, finally, to justify the title of this post: you can run the cmdlets without installing the modules on the local computer. This is especially helpful on PowerShell Core, because some modules that work with Windows PowerShell don't work with PowerShell Core.
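To see the implicit session management in action, you can watch the session list before and after the first call. This is a sketch based on the examples above (it assumes the RemoteDCModule exported earlier was imported with -Prefix FromDC, and the user name is illustrative):

```powershell
# No sessions yet after Remove-PSSession
Get-PSSession

# The first call through the exported module implicitly creates a new PSSession
Get-FromDCADUser -Identity kiran

# Now a session to the DC appears; subsequent calls reuse it
Get-PSSession
```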
Prior to Windows PowerShell 5.0, if you wanted to zip or unzip files you had to depend on COM objects, but from version 5.0 onwards (and in PowerShell Core as well) two new cmdlets, Compress-Archive and Expand-Archive, were introduced to zip and unzip files respectively. Examples:

### Examples are from Microsoft Docs

## Zip the files

# Example 1: Create an archive file
Compress-Archive -LiteralPath C:\Reference\Draftdoc.docx, C:\Reference\Images\diagram2.vsd -CompressionLevel Optimal -DestinationPath C:\Archives\Draft.Zip

# Example 2: Create an archive with wildcard characters
Compress-Archive -Path C:\Reference\* -CompressionLevel Fastest -DestinationPath C:\Archives\Draft

# Example 3: Update an existing archive file
Compress-Archive -Path C:\Reference\* -Update -DestinationPath C:\Archives\Draft.Zip

# Example 4: Create an archive from an entire folder
Compress-Archive -Path C:\Reference -DestinationPath C:\Archives\Draft

## Unzip the file

# Example 1: Extract the contents of an archive
Expand-Archive -LiteralPath C:\Archives\Draft.Zip -DestinationPath C:\Reference

# Example 2: Extract the contents of an archive in the current folder
Expand-Archive -Path Draft.Zip -DestinationPath C:\Reference
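One way to confirm an archive round-trips losslessly is to compare file hashes before and after; a small sketch (the paths here are illustrative, not from the Microsoft Docs examples):

```powershell
# Compress a file, extract it elsewhere, then compare hashes
Compress-Archive -LiteralPath C:\Reference\Draftdoc.docx -DestinationPath C:\Archives\Draft.zip
Expand-Archive -LiteralPath C:\Archives\Draft.zip -DestinationPath C:\Expanded

$before = (Get-FileHash -Path C:\Reference\Draftdoc.docx).Hash
$after  = (Get-FileHash -Path C:\Expanded\Draftdoc.docx).Hash
$before -eq $after   # True when the extracted file is identical to the original
```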
What is a static website?

By definition, a static website contains web pages with fixed content and displays the same content to every user on every visit; static websites are very basic and easy to create. Static websites can be built using HTML, CSS and JavaScript and hosted on Azure Storage; they support client-side script execution but not server-side. If the static website needs some data processing or manipulation on the server side, you can delegate that to Azure Cognitive Services or Azure Functions.

What is the use?

Static websites are very useful when high bandwidth and backend support are not required, the audience is limited, and mostly for a shorter duration of time. Some useful scenarios are…

Explain a project and its road map. Just for the sake of a presentation in a meeting, create some HTML documents with the necessary content, upload them to Azure blob storage and simply access the URL from anywhere in the world.

Showcase products, events and promotions. Sales and marketing teams require nice and colorful web pages to walk through their concepts, so build a website using CSS & HTML, publish it to Azure blob storage and share the link with the intended audience.

Technical documents & manuals. Host technical documentation and manuals relating to your project, and share them with the team across the globe for their perusal.

How does it work?

When the static website feature is enabled on an Azure storage account, you need to enter the default document name (and, optionally, an error document name). When the feature is enabled, a container named $web is created, if it doesn't already exist, to hold your website files. Files in the $web container are read-only, case sensitive and available for anonymous access…

How to access?
Available to the public web following this pattern:
https://<ACCOUNT_NAME>.<ZONE_NAME>.web.core.windows.net/<FILE_NAME>

Available through a Blob storage endpoint following this pattern:
https://<ACCOUNT_NAME>.blob.core.windows.net/$web/<FILE_NAME>

The site can also be served through Azure CDN, with SSL support and a custom domain name as well.

What is the pricing?

The Azure static website feature itself is free; the pricing is only for storage. In addition to the storage costs, data egress charges apply, and if Azure CDN is enabled to use a custom domain with an SSL certificate, its charges apply as well.

How to enable the static website feature and host a website using Azure PowerShell?

All you need is a valid Azure subscription; then follow the steps (in PowerShell)…

Login into your Azure account.
Select the required subscription.
Select/Create a resource group.
Select/Create a storage account (StorageV2).
Set the current storage account to enable the static website feature.
Enable the static website feature on the current storage account.
Upload the website files to the blob storage container ($web).
Verify the files uploaded successfully.
Retrieve the URL to access the static website.
#requires -Module Az

## Ensure logged into your Azure account
if([string]::IsNullOrEmpty($(Get-AzContext))) { Add-AzAccount }

## Define the required variables
$SubscriptionId     = '<SubscriptionId>'    # This is your subscription id (ex: 'f34d6539-c45b-4a93-91d9-0b4e6ffb6030')
$ResourceGroupName  = 'static-websites-rg'  # Resource Group
$Location           = 'southindia'          # Location
$StorageAccountName = 'staticwebsitesa999'  # Storage Account
$WebpagePath        = "C:\wwwroot\"         # Static website files

## Select the required subscription, in case there are multiple subscriptions
Select-AzSubscription -Subscription $SubscriptionId

## Select/Create Azure resource group
# Parameters
$ParamList = @{
    Name     = $ResourceGroupName
    Location = $Location
}

# Create the resource group if it doesn't exist
$ResourceGroup = Get-AzResourceGroup @ParamList -ErrorAction SilentlyContinue
if ($null -eq $ResourceGroup) { New-AzResourceGroup @ParamList }

## Select/Create storage account
# Parameters
$ParamTable = @{
    Name              = $StorageAccountName
    ResourceGroupName = $ResourceGroupName
}

# Create the storage account if it doesn't exist
$StorageAccount = Get-AzStorageAccount @ParamTable -ErrorAction SilentlyContinue
if ($null -eq $StorageAccount) {
    $ParamTable.Location   = $Location
    $ParamTable.SkuName    = 'Standard_LRS'
    $ParamTable.Kind       = 'StorageV2'
    $ParamTable.AccessTier = 'Hot'
    New-AzStorageAccount @ParamTable
}

## Parameters required to use with storage Cmdlets
$ParamTable = @{
    Name              = $StorageAccountName
    ResourceGroupName = $ResourceGroupName
}

## Set the storage account to enable the static website feature
Set-AzCurrentStorageAccount @ParamTable

## Enable the static website feature for the selected storage account
# Ensure the documents are created with the names mentioned
Enable-AzStorageStaticWebsite -IndexDocument "index.html" -ErrorDocument404Path "error.html"

## Upload the website pages to the azure blob container
Get-ChildItem -Path $WebpagePath -Recurse | Set-AzStorageBlobContent -Container '$web'

## Verify the files
uploaded to the azure blob container
Get-AzStorageContainer -Name '$web' | Get-AzStorageBlob

## Retrieve the public URL to access the static website
(Get-AzStorageAccount @ParamTable).PrimaryEndpoints.Web

## Add a custom domain to your static website; this requires adding a CNAME record in your domain's DNS server
Set-AzStorageAccount @ParamTable -CustomDomainName "www.yourdomain.com" -UseSubDomain $True

With the glory of GitHub public repositories, I cloned a simple website, created my profile page just like that, and hosted it on my Azure Storage.
Many a time we come across situations where we need to execute lengthy cmdlets/functions with a bunch of parameters, where the command exceeds the screen width and wraps down to the next line, forces us to scroll towards the end of the command, or needs the line-continuation escape character (`).

Splatting passes the parameter values to a command as a collection of name and value pairs, as a hash table or an array of values. It makes the command shorter, easier to read, and reusable. Although it is a hash table/array variable, the @ symbol is used before the variable name instead of $ when passing it to the command.

SYNTAX

$paramtable = @{
    Name1 = 'Value1'
    Name2 = 'Value2'
    Name3 = 'Value3'
}

C:\> Sample-Command @paramtable
or
C:\> Sample-Command <optional parameters> @paramtable <optional parameters>

A hash table is used to provide named parameter values, and an array is used to provide positional parameter values. When splatting, it is not necessary to pass every parameter through the hash table or array alone; positional and/or named parameters can be used alongside.

EXAMPLE: Splatting with hash table

Create a new file using the New-Item cmdlet by passing the necessary parameters…

# Along with the named parameters
New-Item -Path C:\Windows\Temp\ -Name Delete.txt -ItemType File -Value "Hello World!" -Force

# With hash table
$paramtable = @{
    Path     = 'C:\Windows\Temp\'
    Name     = 'Delete.txt'
    ItemType = 'File'
    Value    = 'Hello World!'
    Force    = $true
}
New-Item @paramtable

EXAMPLE: Splatting with array

Copy a file from one location to another using the Copy-Item cmdlet by passing the necessary parameters…

# Copy a file using named parameters
Copy-Item -Path $env:windir\Temp\CSV1.csv -Destination $env:TEMP\CSV1.csv -Force

# With array
$paramarray = @("$env:windir\Temp\CSV1.csv", "$env:TEMP\CSV1.csv")
Copy-Item @paramarray -Force

Another example…

Function Create-NewItem
{
    [CmdLetBinding(SupportsShouldProcess)]
    param
    (
        [parameter(mandatory=$true,parametersetname="Path")]
        [parameter(mandatory=$false,parametersetname="Name")]
        [string]$Path,
        [parameter(mandatory=$true,parametersetname="Name")]
        [string]$Name,
        [parameter(mandatory=$false,parametersetname="Path")]
        [parameter(mandatory=$false,parametersetname="Name")]
        [string]$ItemType,
        [parameter(mandatory=$false,parametersetname="Path")]
        [parameter(mandatory=$false,parametersetname="Name")]
        [object]$Value,
        [parameter(mandatory=$false,parametersetname="Path")]
        [parameter(mandatory=$false,parametersetname="Name")]
        [switch]$Force,
        [parameter(mandatory=$false,parametersetname="Path")]
        [parameter(mandatory=$false,parametersetname="Name")]
        [pscredential]$Credential
    )
    Write-Host "Creating a new $($ItemType.ToLower())"
    New-Item @PSBoundParameters | Out-Null
    if ($?) { Write-Host "New $($ItemType.ToLower()) has been created successfully" }
}

$paramtable = @{
    Path     = "C:\Temp\"
    ItemType = "Directory"
}
Create-NewItem @paramtable
<#
PS C:\> Create-NewItem @paramtable
Creating a new directory
New directory has been created successfully
#>

#Run the script again
Create-NewItem @paramtable
<#
PS C:\> Create-NewItem @paramtable
Creating a new directory
New-Item : An item with the specified name C:\Temp already exists.
At line:25 char:5
+     New-Item @PSBoundParameters | Out-Null
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ResourceExists: (C:\Temp:String) [New-Item], IOException
    + FullyQualifiedErrorId : DirectoryExist,Microsoft.PowerShell.Commands.NewItemCommand
#>

$paramtable.Force = $true
<#
PS C:\> $paramtable

Name                           Value
----                           -----
Path                           C:\Temp\
Force                          True
ItemType                       Directory
#>

#Run the script again
Create-NewItem @paramtable
<#
PS C:\> Create-NewItem @paramtable
Creating a new directory
New directory has been created successfully
#>

$paramtable.Name = 'Test.txt'
$paramtable.ItemType = 'File'
$paramtable.Remove("Force")
<#
PS C:\> $paramtable

Name                           Value
----                           -----
Path                           C:\Temp\
Name                           Test.txt
ItemType                       File
#>

#Run the script again
Create-NewItem @paramtable
<#
PS C:\> Create-NewItem @paramtable
Creating a new file
New file has been created successfully
#>
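A common pattern that follows from the examples above is keeping the shared parameters in one reusable table and supplying the per-call parameters alongside the splat. A small sketch (file names and values here are illustrative); note that specifying the same parameter both inside the splat and explicitly raises a binding error:

```powershell
# Common parameters kept in one reusable hash table...
$common = @{
    Path     = 'C:\Windows\Temp\'
    ItemType = 'File'
    Force    = $true
}

# ...while per-call parameters are passed alongside the splat
New-Item @common -Name 'One.txt' -Value 'First file'
New-Item @common -Name 'Two.txt' -Value 'Second file'
```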
Sometimes in an interactive session, when executing scripts that require input or consent from the user to proceed further, we can prompt the user to choose from the given choices…

# PromptForChoice Args
$Title = "Do you want to proceed further?"
$Prompt = "Enter your choice"
$Choices = [System.Management.Automation.Host.ChoiceDescription[]] @("&Yes", "&No", "&Cancel")
$Default = 1

# Prompt for the choice
$Choice = $host.UI.PromptForChoice($Title, $Prompt, $Choices, $Default)

# Action based on the choice
switch($Choice)
{
    0 { Write-Host "Yes - Write your code" }
    1 { Write-Host "No - Write your code" }
    2 { Write-Host "Cancel - Write your code" }
}
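Each choice can also carry a help message, which the host displays when the user enters "?" at the prompt. A small variation of the example above (the help texts are illustrative):

```powershell
# Build choices with help messages via the ChoiceDescription(label, helpMessage) constructor
$Choices = [System.Management.Automation.Host.ChoiceDescription[]] @(
    (New-Object System.Management.Automation.Host.ChoiceDescription '&Yes',    'Proceed with the operation')
    (New-Object System.Management.Automation.Host.ChoiceDescription '&No',     'Skip this operation')
    (New-Object System.Management.Automation.Host.ChoiceDescription '&Cancel', 'Abort the script')
)

# Default choice index 1 = "No"
$Choice = $host.UI.PromptForChoice('Do you want to proceed further?', 'Enter your choice', $Choices, 1)
```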
When selecting properties using Select-Object, sometimes we need to present the values in a more meaningful and understandable format, and in some cases conditional output is needed to get precise results. You can achieve this with calculated properties, either by writing the expression directly in the same Select-Object statement or by keeping it in a hash table variable…

# Get the total memory in GB from the local computer using a calculated property with Select-Object
Get-CimInstance -ClassName Win32_OperatingSystem | Select-Object -Property PSComputerName, `
    @{Name = 'Memory in GB'; Expression = {[Math]::Round($_.TotalVisibleMemorySize/1MB)}}

<# Output

PSComputerName Memory in GB
-------------- ------------
Workstation               8

#>

# Get the services whose names start with App, and display IsRunning as Yes/No using a calculated property
$IsRunning = @{
    Label      = "IsRunning"
    Expression = { if($_.Status -eq 'Running') { "Yes" } else { "No" } }
}
Get-Service -Name App* | Select-Object -Property Name, DisplayName, $IsRunning

<# Output

Name         DisplayName                       IsRunning
----         -----------                       ---------
AppIDSvc     Application Identity              No
Appinfo      Application Information           Yes
AppMgmt      Application Management            No
AppReadiness App Readiness                     No
AppVClient   Microsoft App-V Client            No
AppXSvc      AppX Deployment Service (AppXSVC) No

#>
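The same expression hash tables also work with Sort-Object, so you can sort on a value that doesn't exist as a real property. A sketch along the lines of the examples above:

```powershell
# Sort processes by private memory expressed in MB (calculated on the fly),
# then show the top 5 with a friendlier column name
Get-Process |
    Sort-Object -Property @{Expression = {$_.PM / 1MB}} -Descending |
    Select-Object -First 5 -Property Name, @{Name = 'PM (MB)'; Expression = {[Math]::Round($_.PM / 1MB, 2)}}
```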
You know why Excel is still one of the most popular reporting tools: it is very easy to use, customizable to your needs, and moreover it is interactive. There are many ways to generate Excel reports programmatically, but especially when you are using a scripting language like PowerShell, there are tons of modules already available in the public repositories with sophisticated features to reuse in your code, and they are very easy and simple to use.

Please be careful when you install modules from the public repositories: people around the world publish code that works for them in their environment and with their settings, but it may be harmful to your environment. So please ensure the scripts work fine in your test environment first, and only then use them in your production environment.

The ImportExcel PowerShell module by Doug Finke from the PowerShell Gallery is very popular and much helpful; it works even without Excel installed on your computer, and it's an open source project on GitHub. This module is very rich in features and compact to use in your code, and I find it very useful and helpful.
Now let's get started with the ImportExcel module in PowerShell Core…

Since the PowerShell Gallery is a default repository in PowerShell, you don't need to register it again; just ensure that you have PSGallery as a default repository…

Get-PSRepository

Now it's time to get the module installed and imported into the session…

# Find the module
Find-Module -Name ImportExcel

# Install the module
Find-Module -Name ImportExcel | Install-Module
Install-Module -Name ImportExcel

# Update the module
Update-Module -Name ImportExcel

# Verify the module is installed
Get-Module -Name ImportExcel -ListAvailable

# Import the module
Import-Module -Name ImportExcel

To check the cmdlets available in the ImportExcel module, run the cmdlet below…

Get-Command -Module ImportExcel

Now let's see how to export the data to Excel and the various options…

The Export-Excel cmdlet does all the magic with its various parameters; to simply export the data to Excel, just pipe the output to Export-Excel. This exports the data, applies filters, auto sizes the columns and pops up the Excel window, but it does not save the file to disk.

Get-Service | Select Name, DisplayName, Status | Export-Excel

…to export the data to Excel and save the file to disk, use the -Path flag with Export-Excel…

Get-Service | Select Name, DisplayName, Status | Export-Excel -Path C:\Test.xlsx

Observe the data in the Excel file opened after it was created: the columns are compact and not readable, and no filters are applied.
By default, without any parameters, the Export-Excel cmdlet does not save the file to disk but does show the window, apply filters and auto size the columns. When we use -Path and still want to pop up the window, apply filters and auto size the columns, we need to use the -Show, -AutoSize and -AutoFilter flags…

# To show the window after the file is saved to disk
Get-Service | Select Name, DisplayName, Status | Export-Excel -Path .\Test.xlsx -Show

# To apply filters and auto size the columns
Get-Service | Select Name, DisplayName, Status | Export-Excel -Path .\Test.xlsx -Show -AutoSize -AutoFilter

Now let's see formatting the text in the Excel reports…

# Export the services to Excel and highlight the running and the stopped services separately
$ConTxt1 = New-ConditionalText -Text 'Stopped' -ConditionalTextColor Red -BackgroundColor Yellow
$ConTxt2 = New-ConditionalText -Text 'Running' -ConditionalTextColor Yellow -BackgroundColor Green
Get-Service | Select Status, Name, DisplayName | Export-Excel -Path .\Test.xlsx -AutoSize -Show -ConditionalFormat $ConTxt1, $ConTxt2 # '-ConditionalFormat' parameter accepts arrays

Setting icons on the values to represent the changes within the given range…

# Get the processes, and represent the changes in the memory with icons
$ConFmt = New-ConditionalFormattingIconSet -Range "C:C" -ConditionalFormat FiveIconSet -IconType Arrows
Get-Process | Select Company, Name, PM, Handles | Export-Excel -Path .\Process.xlsx -Show -AutoSize -AutoFilter -ConditionalFormat $ConFmt

# Also club it with the conditional text
$ConTxt = New-ConditionalText -Text 'Microsoft' -ConditionalTextColor Yellow -BackgroundColor Green
Get-Process | Select Company, Name, PM, Handles | Export-Excel -Path .\Process.xlsx -Show -AutoSize -AutoFilter -ConditionalFormat $ConFmt, $ConTxt

Now let's see creating some pivot tables and charts…

# Get the services and identify the number of services running & stopped and the services count per start
type
$Data = Get-Service | Select-Object Status, Name, DisplayName, StartType | Sort-Object StartType

# Parameters in a hashtable
$Param = @{
    Show              = $true
    AutoSize          = $true
    IncludePivotTable = $true
    PivotRows         = 'StartType'
    PivotData         = 'StartType'
    PivotColumns      = 'Status'
}

# Create the pivot table
$Data | Export-Excel -Path C:\GitRepo\Test.xlsx @Param

# Get the services and identify the number of services running & stopped and the services count per start type
$Data = Get-Service | Select-Object Status, Name, DisplayName, StartType | Sort-Object StartType

# Parameters in a hashtable
$Param = @{
    Show              = $true
    AutoSize          = $true
    PivotRows         = 'StartType'
    PivotData         = 'StartType'
    IncludePivotChart = $true
    ChartType         = 'PieExploded3D'
}

# Create the pivot charts
$Data | Export-Excel -Path C:\GitRepo\Test.xlsx @Param

There are plenty of options available, so please explore all the features of ImportExcel and make the best use of this module. You can also achieve the same by writing your own code, but this module is very compact and easy to use. Many thanks to Doug Finke! #ImportExcel
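The module also reads workbooks back: Import-Excel turns each worksheet row into a PSCustomObject whose properties come from the column headers. A quick sketch, assuming the Test.xlsx file produced by the earlier examples:

```powershell
# Read the worksheet back as objects and filter them like any other PowerShell data
$Services = Import-Excel -Path C:\GitRepo\Test.xlsx
$Services | Where-Object Status -eq 'Running' | Select-Object -First 5
```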
Recently I upgraded to PowerShell Core and am slowly switching from Windows PowerShell to PowerShell Core. I have noticed quite a few cmdlets are missing in PowerShell Core; since it became open source and supports cross-platform use, most of the platform-dependent cmdlets won't work on the other platforms. I usually clear my temp folders and the recycle bin on all my computers frequently, and noticed Clear-RecycleBin is not a valid cmdlet in PowerShell Core…

Clear-RecycleBin : The term 'Clear-RecycleBin' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
+ Clear-RecycleBin -Force
+ ~~~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (Clear-RecycleBin:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException

So I decided to create a PowerShell function to achieve almost the same functionality as Clear-RecycleBin using a .NET class, and here is the function…

Function Empty-RecycleBin
{
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact = 'High')]
    param
    (
        [Parameter(Mandatory=$false)]
        [switch] $Force # Without confirmation
    )

    if($IsWindows -eq $false) { return } # Exit the script if the OS is other than Windows

    # Since the Clear-RecycleBin CmdLet is not available on PowerShell Core,
    # achieve the same functionality using the .NET classes.
    $Type = @'
using System;
using System.Runtime.InteropServices;

namespace MyComputer
{
    public static class RecycleBin
    {
        [DllImport("Shell32.dll", CharSet = CharSet.Unicode)]
        static extern uint SHEmptyRecycleBin(IntPtr hwnd, string pszRootPath, int dwFlags);

        public static void Empty()
        {
            SHEmptyRecycleBin(IntPtr.Zero, null, 1);
        }
    }
}
'@
    Add-Type -TypeDefinition $Type

    # Bypass confirmation, and empty the recyclebin
    if ($PSBoundParameters.ContainsKey('Force'))
    {
        [MyComputer.RecycleBin]::Empty()
        return
    }

    # Default behaviour, empty the recyclebin with confirmation
    if($PSCmdlet.ShouldProcess('All of the contents of the Recycle Bin','Empty-RecycleBin'))
    {
        [MyComputer.RecycleBin]::Empty()
        return
    }
}

I have not added a -DriveLetter flag, since I want to clear the recycle bin on all the drives; if you want to clear the recycle bin on a specific drive, you need to add a drive-letter argument to the Empty method in the C# code and add a -DriveLetter parameter to the PowerShell function.
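As a sketch of that -DriveLetter idea: the Win32 SHEmptyRecycleBin function accepts a root path as its second argument (null means all drives), so a variation of the class above could take the drive's root path. This is a hypothetical adaptation, not part of the function above:

```powershell
# Hypothetical variation: overload Empty() with a root path ("C:\" empties only that drive)
$Type = @'
using System;
using System.Runtime.InteropServices;

namespace MyComputer
{
    public static class RecycleBinPerDrive
    {
        [DllImport("Shell32.dll", CharSet = CharSet.Unicode)]
        static extern uint SHEmptyRecycleBin(IntPtr hwnd, string pszRootPath, int dwFlags);

        // Pass null for all drives, or a root path such as "C:\\" for a single drive
        public static void Empty(string rootPath)
        {
            SHEmptyRecycleBin(IntPtr.Zero, rootPath, 1);
        }
    }
}
'@
Add-Type -TypeDefinition $Type

[MyComputer.RecycleBinPerDrive]::Empty('C:\')   # empty only drive C:
[MyComputer.RecycleBinPerDrive]::Empty($null)   # empty all drives
```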
The code below will download the .zip file from the internet, extract the files from the zip and open the extracted folder…

$Url = 'https://download.sysinternals.com/files/BGInfo.zip'
$ZipFile = 'C:\ZipFolder\' + $(Split-Path -Path $Url -Leaf)
$Destination = 'C:\Extracted\'

Invoke-WebRequest -Uri $Url -OutFile $ZipFile

$ExtractShell = New-Object -ComObject Shell.Application
$Files = $ExtractShell.Namespace($ZipFile).Items()
$ExtractShell.NameSpace($Destination).CopyHere($Files)
Start-Process $Destination
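On PowerShell 5.0 and later the same flow works without the Shell.Application COM object, which also makes it usable on PowerShell Core; a sketch using Expand-Archive with the same paths as above:

```powershell
# Download the zip, extract it with Expand-Archive, and open the destination folder
$Url         = 'https://download.sysinternals.com/files/BGInfo.zip'
$ZipFile     = Join-Path -Path 'C:\ZipFolder' -ChildPath (Split-Path -Path $Url -Leaf)
$Destination = 'C:\Extracted\'

Invoke-WebRequest -Uri $Url -OutFile $ZipFile
Expand-Archive -LiteralPath $ZipFile -DestinationPath $Destination -Force
Invoke-Item -Path $Destination
```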
Deploying virtual machines in the Azure cloud using templates is the best way to create VMs when you want attributes like quick, consistent and reusable deployments that handle the dependencies among the resources. However, PowerShell has the flexibility to deploy VMs with ease and allows the user to choose only the parameters necessary for the particular deployment, without even touching the code. Of course this approach is also quick and reusable, but the user has to ensure the consistency and dependencies among the resources, if required, while creating them in the Azure cloud.

Since the new Azure PowerShell module Az 1.0.1 was released, I have written the scripts using the Az module cmdlets. So please install the Az module on Windows PowerShell or PowerShell Core, import the module and connect to your Azure account using Connect-AzAccount.

Add the required parameters to the script…

#requires -Modules Az
param
(
    [Parameter(Mandatory=$true)]  [string] $ResourceGroupName,  # Resource Group
    [Parameter(Mandatory=$true)]  [string] $VMName,             # VM Name
    [Parameter(Mandatory=$true)]  [string] $Location,           # Location
    [Parameter(Mandatory=$true)]  [ValidateSet('Windows','Linux')]
                                  [string] $OSType,             # OS Type (Windows/Linux)
    [Parameter(Mandatory=$true)]  [string] $VirtualNetworkName, # VNet
    [Parameter(Mandatory=$true)]  [string] $SubnetName,         # Subnet
    [Parameter(Mandatory=$true)]  [string] $SecurityGroupName,  # NSG
    [Parameter(Mandatory=$false)] [string] $VMSize,             # VM Size
    [Parameter(Mandatory=$false)] [switch] $AssignPublicIP,     # Assign PIP
    [Parameter(Mandatory=$false)] [pscredential]$VMCredential,  # VM login credential
    [Parameter(Mandatory=$false)] [Int[]]  $AllowedPorts        # NSG rules
)

Ensure you are connected to an Azure subscription; if not, the script exits, and you can connect to the Azure subscription using the Connect-AzAccount cmdlet (a browser-based authentication).

# Verify Login
if( -not $(Get-AzContext) ) { return }

Ensure that there is no existing VM with the same name in the resource group.
If a VM with that name already exists, the script exits.

# Verify VM doesn't exist
$VM = Get-AzVM -ResourceGroupName $ResourceGroupName -Name $VMName -ErrorAction SilentlyContinue
if($null -ne $VM) { return }

Create the VM login credentials, if not provided along with the script…

# Create user object
if (-not $PSBoundParameters.ContainsKey('VMCredential'))
{
    [pscredential] $VMCredential = Get-Credential -Message 'Please enter the vm credentials'
}

# Verify credential
if ($VMCredential.GetType().Name -ne "PSCredential") { return }

The script identifies the existing resources with the names provided; if they exist they will be used, and if they don't exist they will be created with different names. Two things you need to choose based on your requirements: one is the VM size and the other is the OS image (Sku)…

# Lists all the VM Sizes available in South India region
PS C:\> Get-AzVMSize -Location southindia

To retrieve the OS Skus, I have written another post, List of available Azure VM Image skus using new Azure PowerShell module Az; please refer to it…

Now the main block starts from here…

# Verify/Create a resource group
$ResourceGroup = Get-AzResourceGroup -Name $ResourceGroupName -ErrorAction SilentlyContinue
if ($null -eq $ResourceGroup)
{
    $ResourceGroup = New-AzResourceGroup -Name $ResourceGroupName -Location $Location
}

# Verify the virtual network
$VNet = Get-AzVirtualNetwork -Name $VirtualNetworkName -ResourceGroupName $ResourceGroup.ResourceGroupName -ErrorAction SilentlyContinue
if ($null -eq $VNet)
{
    [Microsoft.Azure.Commands.Network.Models.PSSubnet] $SubnetConfig = New-AzVirtualNetworkSubnetConfig -Name $SubnetName -AddressPrefix 192.168.1.0/24
    $VNet = New-AzVirtualNetwork -ResourceGroupName $ResourceGroup.ResourceGroupName -Location $Location -Name $VirtualNetworkName -AddressPrefix 192.168.0.0/16 -Subnet $SubnetConfig
}
else
{
    $Subnets = Get-AzVirtualNetworkSubnetConfig -VirtualNetwork $VNet
    $SubnetConfig = $Subnets | Where-Object -FilterScript {$_.Name
-eq $SubnetName}
    if ($null -eq $SubnetConfig)
    {
        $VNetAddressPrefixes = $VNet.AddressSpace.AddressPrefixes
        $AddressPrefix = @($VNetAddressPrefixes.Split('.'))
        $AddressPrefix[2] = [int]($Subnets.AddressPrefix|Measure-Object -Maximum).Maximum.ToString().Split('.')[2] + 1
        $AddressPrefix = $AddressPrefix -join '.'
        $VNet | Add-AzVirtualNetworkSubnetConfig -Name $SubnetName -AddressPrefix $AddressPrefix | Set-AzVirtualNetwork
    }
}
$Subnet = Get-AzVirtualNetworkSubnetConfig -Name $SubnetName -VirtualNetwork $VNet

# Create a public IP address and specify a DNS name
if ($PSBoundParameters.ContainsKey('AssignPublicIP'))
{
    [string] $PipName = $VMName + '-pip'
    $VerifyPip = Get-AzPublicIpAddress -Name $PipName -ResourceGroupName $ResourceGroup.ResourceGroupName -ErrorAction SilentlyContinue
    if ($null -ne $VerifyPip) { $PipName = $VMName + '-pip-' + $(Get-Random).ToString() }
    $PublicIP = New-AzPublicIpAddress -ResourceGroupName $ResourceGroup.ResourceGroupName -Location $Location -Name $PipName -AllocationMethod Static -IdleTimeoutInMinutes 4
}

# Create/Select a network security group
$NSG = Get-AzNetworkSecurityGroup -Name $SecurityGroupName -ResourceGroupName $ResourceGroup.ResourceGroupName -ErrorAction SilentlyContinue
if ($null -eq $NSG)
{
    # Create the inbound network security group rules
    if ($PSBoundParameters.ContainsKey('AllowedPorts'))
    {
        [System.Array] $NsgRules = @()
        [int] $Priority = 1000
        foreach ($Port in $AllowedPorts)
        {
            $Rule = New-AzNetworkSecurityRuleConfig -Name "Allow_$Port" -Protocol Tcp -Direction Inbound -Priority $Priority -SourceAddressPrefix * -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange $Port -Access Allow
            $Priority++
            $NsgRules += $Rule
        }
        $NSG = New-AzNetworkSecurityGroup -ResourceGroupName $ResourceGroup.ResourceGroupName -Location $Location -Name $SecurityGroupName -SecurityRules $NsgRules
    }
    else
    {
        $NSG = New-AzNetworkSecurityGroup -ResourceGroupName $ResourceGroup.ResourceGroupName -Location $Location -Name $SecurityGroupName
    }
}
else
{
    # Add the inbound network security group rules, if any are missing
    if ($PSBoundParameters.ContainsKey('AllowedPorts'))
    {
        $NSGAllowedPorts = $NSG.SecurityRules | Where-Object -FilterScript {$_.Access -eq "Allow"} | Select-Object -ExpandProperty DestinationPortRange
        $PortsToAllow = $AllowedPorts | Where-Object -FilterScript {$_ -notin $NSGAllowedPorts}
        $Priority = ($NSG.SecurityRules.Priority|Measure-Object -Maximum).Maximum + 100
        if ($PortsToAllow.Count -gt 0)
        {
            foreach($Port in $PortsToAllow)
            {
                $NSG | Add-AzNetworkSecurityRuleConfig -Name "Allow_$Port" -Protocol Tcp -Direction Inbound -Priority $Priority -SourceAddressPrefix * -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange $Port -Access Allow | Set-AzNetworkSecurityGroup
            }
        }
    }
}

# Create a virtual network card and associate it with the public IP address and NSG
$NICName = "$VMName-nic"
$NIC = Get-AzNetworkInterface -Name $NICName -ResourceGroupName $ResourceGroup.ResourceGroupName -ErrorAction SilentlyContinue
if ($null -ne $NIC) { $NICName = $VMName + "-nic-" + $(Get-Random).ToString() }
$NIC = New-AzNetworkInterface -Name $NICName -ResourceGroupName $ResourceGroup.ResourceGroupName -Location $Location -SubnetId $Subnet.Id -NetworkSecurityGroupId $NSG.Id
if ($PSBoundParameters.ContainsKey('AssignPublicIP'))
{
    $NIC | Set-AzNetworkInterfaceIpConfig -Name $NIC.IpConfigurations[0].Name -PublicIpAddressId $PublicIP.Id -SubnetId $Subnet.Id | Set-AzNetworkInterface | Out-Null
}

# VM Size
if($PSBoundParameters.ContainsKey('VMSize') -eq $false ) { $VMSize = 'Standard_A1' }

# OS Type
$VMSourceImage = @{PublisherName='';Offer='';Sku=''}
switch ($OSType)
{
    'Windows' {
        $VMSourceImage.PublisherName = 'MicrosoftWindowsServer'
        $VMSourceImage.Offer = 'WindowsServer'
        $VMSourceImage.Sku = '2016-Datacenter'
    }
    'Linux' {
        $VMSourceImage.PublisherName = 'Canonical'
        $VMSourceImage.Offer = 'UbuntuServer'
        $VMSourceImage.Sku = '18.10-DAILY'
    }
}

# Create a virtual machine configuration
$VMConfig = New-AzVMConfig -VMName
$VMName -VMSize $VMSize
if ($OSType -eq 'Windows')
{
    $VMConfig | Set-AzVMOperatingSystem -Windows -ComputerName $VMName -Credential $VMCredential | Out-Null
}
else
{
    $VMConfig | Set-AzVMOperatingSystem -Linux -ComputerName $VMName -Credential $VMCredential | Out-Null
}
$VMConfig | Set-AzVMSourceImage -PublisherName $VMSourceImage.PublisherName -Offer $VMSourceImage.Offer -Skus $VMSourceImage.Sku -Version latest | Out-Null
$VMConfig | Add-AzVMNetworkInterface -Id $NIC.Id | Out-Null
$VMConfig | Set-AzVMBootDiagnostic -Disable | Out-Null

# Create a virtual machine
New-AzVM -ResourceGroupName $ResourceGroup.ResourceGroupName -Location $Location -VM $VMConfig

To create a Windows VM…

.\Create-AzVM.ps1 -ResourceGroupName test-rg `
                  -VMName testvm -Location southindia `
                  -OSType Windows `
                  -VirtualNetworkName test-vnet `
                  -SubnetName testnet `
                  -SecurityGroupName test-nsg `
                  -AssignPublicIP `
                  -AllowedPorts 3389 `
                  -VMCredential $cred `
                  -Verbose

To create a Linux VM…

.\Create-AzVM.ps1 -ResourceGroupName test-rg `
                  -VMName testvm -Location southindia `
                  -OSType Linux `
                  -VirtualNetworkName test-vnet `
                  -SubnetName testnet `
                  -SecurityGroupName test-nsg `
                  -AssignPublicIP `
                  -AllowedPorts 22 `
                  -VMCredential $cred `
                  -Verbose

You can find the complete source code in my Git repository.
In the Azure cloud, tags play a major role in managing resources in an easy way; in other words, tags are additional metadata associated with the Azure resources. We can assign tags to individual resources like a VM, storage account, VNet and so on, and we can also assign tags to resource groups as well. Resource groups allow us to organize related resources together and facilitate their management, but tags can group resources beyond (and including) resource groups; at the same time, resources inside a resource group do not inherit the tags associated with the resource group.

Tags are key and value combinations that can be assigned to the resources in the Azure cloud, for example…

Tag Key          Tag Value
ResourceType     VM
Project          MyProject
Department       Marketing
Environment      Production
CostCenterCode   123456

Do bear in mind that each individual resource can have up to 15 tags max (Microsoft keeps updating the limits from time to time, so please refer to the Microsoft Docs for the exact number), and ensure the tags follow a unique and consistent naming convention across Azure resources.

Tags are used to organize the deployed resources in the Azure cloud; we can search the resources by tag key/value. For example, searching for resources with the tags {Key:Value} Type:VM and Environment:Production returns all the production VMs across the resource groups within a subscription. Tags are also used to view related resources, like all the resources tagged to a specific project or a specific cost center, and to facilitate billing and cost management.

Tags can be created at the time of creating resources or at a later time by using the Azure portal or any command-line tools like PowerShell or the Azure CLI.
Let's see how we can create and manage tags using PowerShell…

#requires -Module Az
# Connect-AzAccount

### Add new tags to an existing resource

# Get the resource
$Resource = Get-AzResource -Name testvm -ResourceGroupName test-rg

# Resource tags
[hashtable] $Tags = $Resource.Tags

# Ensure not to overwrite the existing tags
if ($null -eq $Tags) {
    [hashtable] $Tags = @{Type="VM"; Environment="Test"}
}
else {
    # Indexed assignment adds or updates a key;
    # += would throw if the key already exists in the hashtable
    $Tags["Type"] = "VM"
    $Tags["Environment"] = "Test"
}

# Add the new tags to the resource (-Force to override the confirmation if there are any existing tags)
Set-AzResource -ResourceId $Resource.Id -Tag $Tags -Force

### Remove an existing tag / remove all tags from a resource

# Get the resource
$Resource = Get-AzResource -Name testvm -ResourceGroupName test-rg

# Resource tags
[hashtable] $Tags = $Resource.Tags

# Remove the specific tag
$Tags.Remove("Type")

# Overwrite the resource with the remaining tags (-Force to override the confirmation if there are any existing tags)
Set-AzResource -ResourceId $Resource.Id -Tag $Tags -Force

## Remove all tags
Set-AzResource -ResourceId $Resource.Id -Tag @{} -Force

### List all the resources with a specific tag key
Get-AzResource -TagName "Environment"

### List all the resources with a specific tag value
Get-AzResource -TagValue "Test"

### List all the resources with a specific tag key and value
Get-AzResource -Tag @{Environment="Test"}

### List all the tags in a subscription and the number of resources associated with each
Get-AzTag
Sometimes we deliberately do not create and assign a public IP to an Azure virtual machine, to avoid exposing it to the internet as a safety measure. Later, though, we may need to access the VM over the internet, and for that we definitely need a public IP address. The script below helps to create and assign a public IP address to an Azure VM. Be careful: if no Network Security Group is associated with the virtual machine, by default all ports are open to the internet.

#requires -Module Az

# Function to create and assign a public ip address
# to an Azure Virtual Machine using the Az PowerShell module.
Function Assign-AzVMPublicIP2 {
    Param (
        # Resource Group Name
        [Parameter(Mandatory=$true)]
        [string] $ResourceGroupName,

        # Virtual Machine Name
        [Parameter(Mandatory=$true)]
        [string] $VMName
    )

    # Retrieve the Virtual Machine details
    $VM = Get-AzVM -ResourceGroupName $ResourceGroupName -Name $VMName -ErrorAction SilentlyContinue

    # Check the VM existence
    if ($null -eq $VM) {
        Write-Error "Please enter a valid and existing Resource Group Name and Virtual Machine Name"
        return
    }

    $Location = $VM.Location                            # Location in which to create the public ip
    $NICId = $VM.NetworkProfile.NetworkInterfaces.Id    # Network Interface resource id
    $NICResource = Get-AzResource -ResourceId $NICId    # Retrieve the NIC resource details

    # Retrieve the NIC object
    $NIC = Get-AzNetworkInterface -Name $NICResource.Name -ResourceGroupName $NICResource.ResourceGroupName

    $NICIPConfigName = $NIC.ipConfigurations[0].Name    # IP config name to be used with the Set-AzNetworkInterfaceIpConfig cmdlet
    $NICSubnetId = $NIC.ipConfigurations[0].subnet.id   # Subnet id to be used with the Set-AzNetworkInterfaceIpConfig cmdlet

    # Create a public ip
    $PublicIP = New-AzPublicIpAddress -ResourceGroupName $ResourceGroupName -Location $Location -Name "$VMName-pip" -AllocationMethod Static -IdleTimeoutInMinutes 4

    # Warn the user if no NSG is associated with this VM
    if ($null -eq $NIC.NetworkSecurityGroup) {
        Write-Warning "Since no Network Security Group is associated with this Virtual Machine, by default all ports are open to the internet."
    }

    # Assign the public ip to the VM NIC
    $NIC | Set-AzNetworkInterfaceIpConfig -Name $NICIPConfigName -SubnetId $NICSubnetId -PublicIpAddressId $PublicIP.Id | Set-AzNetworkInterface
}

Assign-AzVMPublicIP2 -ResourceGroupName test-rg -VMName test-vm
This Terraform configuration deploys a simple Windows VM in the Azure cloud; it is a conversion of 101-vm-simple-windows from the azure_quickstart_templates. The configuration consists of two .tf files (variables.tf and main.tf) and deploys a single VM with the following resources…

Visualized in ARMVIZ

Deployed resources

Configuration

variables.tf

Variables are separated from the main configuration, and this configuration accepts the variables below…

provider "azurerm" {
  version = "=1.36.0"
}

variable "resourceGroupName" {
  type        = string
  description = "Resource Group for this deployment."
}

variable "location" {
  type        = string
  description = "Location for all resources"
}

variable "adminUsername" {
  type        = string
  description = "Username for the Virtual Machine."
}

variable "adminPassword" {
  type        = string
  description = "Password for the Virtual Machine."
}

variable "dnsLabelPrefix" {
  type        = string
  description = "Unique DNS Name for the Public IP used to access the Virtual Machine."
}

variable "windowsOSVersion" {
  type        = list(string)
  default     = ["2016-Datacenter", "2008-R2-SP1", "2012-Datacenter", "2012-R2-Datacenter", "2016-Nano-Server", "2016-Datacenter-with-Containers", "2019-Datacenter"]
  description = "The Windows version for the VM. This will pick a fully patched image of this given Windows version."
}

variable "vmSize" {
  type        = string
  default     = "Standard_A2_v2"
  description = "Size of the virtual machine."
}

main.tf

This file contains the actual configuration to create a simple Windows VM…

# Declaring the local variables
locals {
  storageAccountName       = lower(join("", ["sawinvm", random_string.asaname-01.result]))
  nicName                  = "myVMNic"
  addressPrefix            = "10.0.0.0/16"
  subnetName               = "Subnet"
  subnetPrefix             = "10.0.0.0/24"
  publicIPAddressName      = "myPublicIP"
  vmName                   = "SimpleWinVM"
  virtualNetworkName       = "MyVNET"
  networkSecurityGroupName = "default-NSG"
  osDiskName               = join("", [local.vmName, "_OsDisk_1_", lower(random_string.avmosd-01.result)])
}

# Generating a random string to create a unique storage account name
resource "random_string" "asaname-01" {
  length  = 16
  special = false
}

# Generating a random string to create a unique os disk name
resource "random_string" "avmosd-01" {
  length  = 32
  special = false
}

# Resource Group
resource "azurerm_resource_group" "arg-01" {
  name     = var.resourceGroupName
  location = var.location
}

# Storage Account
resource "azurerm_storage_account" "asa-01" {
  name                     = local.storageAccountName
  resource_group_name      = azurerm_resource_group.arg-01.name
  location                 = azurerm_resource_group.arg-01.location
  account_replication_type = "LRS"
  account_tier             = "Standard"
}

# Public IP
resource "azurerm_public_ip" "apip-01" {
  name                = local.publicIPAddressName
  resource_group_name = azurerm_resource_group.arg-01.name
  location            = azurerm_resource_group.arg-01.location
  allocation_method   = "Dynamic"
  domain_name_label   = var.dnsLabelPrefix
}

# Network Security Group with an allow-RDP rule
resource "azurerm_network_security_group" "ansg-01" {
  name                = local.networkSecurityGroupName
  resource_group_name = azurerm_resource_group.arg-01.name
  location            = azurerm_resource_group.arg-01.location

  security_rule {
    name                       = "default-allow-3389"
    priority                   = 1000
    access                     = "Allow"
    direction                  = "Inbound"
    destination_port_range     = "3389"
    protocol                   = "Tcp"
    source_port_range          = "*"
    source_address_prefix      = "*"
    destination_address_prefix = "*"
  }
}

# Virtual Network
resource "azurerm_virtual_network" "avn-01" {
  name                = local.virtualNetworkName
  resource_group_name = azurerm_resource_group.arg-01.name
  location            = azurerm_resource_group.arg-01.location
  address_space       = [local.addressPrefix]
}

# Subnet
resource "azurerm_subnet" "as-01" {
  name                 = local.subnetName
  resource_group_name  = azurerm_resource_group.arg-01.name
  virtual_network_name = azurerm_virtual_network.avn-01.name
  address_prefix       = local.subnetPrefix
}

# Associate the subnet with the NSG
resource "azurerm_subnet_network_security_group_association" "asnsga-01" {
  subnet_id                 = azurerm_subnet.as-01.id
  network_security_group_id = azurerm_network_security_group.ansg-01.id
}

# Network Interface Card
resource "azurerm_network_interface" "anic-01" {
  name                = local.nicName
  resource_group_name = azurerm_resource_group.arg-01.name
  location            = azurerm_resource_group.arg-01.location

  ip_configuration {
    name                          = "ipconfig1"
    private_ip_address_allocation = "Dynamic"
    public_ip_address_id          = azurerm_public_ip.apip-01.id
    subnet_id                     = azurerm_subnet.as-01.id
  }
}

# Virtual Machine
resource "azurerm_virtual_machine" "avm-01" {
  name                  = local.vmName
  resource_group_name   = azurerm_resource_group.arg-01.name
  location              = azurerm_resource_group.arg-01.location
  vm_size               = var.vmSize
  network_interface_ids = [azurerm_network_interface.anic-01.id]

  os_profile {
    computer_name  = local.vmName
    admin_username = var.adminUsername
    admin_password = var.adminPassword
  }

  storage_image_reference {
    publisher = "MicrosoftWindowsServer"
    offer     = "WindowsServer"
    sku       = var.windowsOSVersion[0]
    version   = "latest"
  }

  storage_os_disk {
    name          = local.osDiskName
    create_option = "FromImage"
  }

  storage_data_disk {
    name          = "Data"
    disk_size_gb  = 1023
    lun           = 0
    create_option = "Empty"
  }

  os_profile_windows_config {
    provision_vm_agent = true
  }

  boot_diagnostics {
    enabled     = true
    storage_uri = azurerm_storage_account.asa-01.primary_blob_endpoint
  }
}

# Print the virtual machine dns name
output "hostname" {
  value = azurerm_public_ip.apip-01.fqdn
}

Deploy the configuration

Authentication

Please use the Azure CLI to authenticate to your Azure cloud environment; Terraform can use the same session to deploy the resources. Check here for alternate methods of authentication.

Initialization

The first time, you need to initialize the configuration directory, where Terraform downloads the plugins necessary to deploy the current configuration. To initialize, run the command below…

terraform init

Plan

Terraform checks for syntax errors and generates the execution plan; you can also save this plan for future deployments…

terraform plan

Apply

To apply the configuration, run the command below…

terraform apply
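Since the configuration declares five variables without defaults, terraform plan will prompt for each of them interactively. A terraform.tfvars file in the configuration directory is picked up automatically and avoids the prompts; a minimal sketch with placeholder values (adjust everything to your environment):

```hcl
# terraform.tfvars -- sample input values for this configuration
resourceGroupName = "simplewinvm-rg"
location          = "southindia"
adminUsername     = "azureuser"
adminPassword     = "ChangeMe123!"      # prefer a secrets mechanism over plain text in real deployments
dnsLabelPrefix    = "simplewinvm-demo"  # must be globally unique within the region
```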
As you might already know, Microsoft has released a new Azure PowerShell module, Az, to replace the AzureRM module in the future. As of now both versions are available for Windows PowerShell and PowerShell Core, but there will be no further development for the AzureRM module except bug fixes; all updates and feature enhancements will come with the new Az module itself. To get started with the Az module, let's retrieve the list of Azure VM images (SKUs) available in a given location from a given publisher with its offerings…

#requires -Module Az
# Please connect to Azure using Connect-AzAccount

# Get the complete list of Azure service locations
Get-AzLocation

<#
Get-AzLocation | Where-Object -FilterScript {$_.Location -match 'india'}

Location    : southindia
DisplayName : South India
Providers   : {Microsoft.Batch, Microsoft.ClassicCompute, Microsoft.ClassicNetwork, Microsoft.ClassicStorage...}

Location    : centralindia
DisplayName : Central India
Providers   : {Microsoft.Automation, Microsoft.Batch, Microsoft.ClassicCompute, Microsoft.ClassicNetwork...}

Location    : westindia
DisplayName : West India
Providers   : {Microsoft.ClassicCompute, Microsoft.ClassicNetwork, Microsoft.ClassicStorage, Microsoft.Compute...}
#>

# Select a location
[string] $Location = 'South India'

# Get the complete list of VM image publishers in the location
Get-AzVMImagePublisher -Location $Location

<#
Get-AzVMImagePublisher -Location "South India" | Where-Object -FilterScript {$_.PublisherName -in ('MicrosoftWindowsServer','Canonical')}

PublisherName          Location   Id
-------------          --------   --
Canonical              SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical
MicrosoftWindowsServer SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/MicrosoftWindowsServer
#>

# Select a publisher
[string] $Publisher = 'Canonical'

# Get the list of offerings from the publisher within the location
Get-AzVMImageOffer -Location $Location -PublisherName $Publisher
<# Get-AzVMImageOffer -Location $Location -PublisherName $Publisher | Format-List * Offer PublisherName Location Id ----- ------------- -------- -- UbuntuServer Canonical SouthIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/VMImage/Offers/Ubun... Ubuntu_Core Canonical SouthIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/VMImage/Offers/Ubun... #> # Select the offering [string] $Offer = 'UbuntuServer' # Get the list of image skus available in the given location from the given publisher with the given offerings Get-AzVMImageSku -Location $Location -PublisherName $Publisher -Offer $Offer <# Get-AzVMImageSku -Location $Location -PublisherName $Publisher -Offer $Offer Skus Offer PublisherName Location Id ---- ----- ------------- -------- -- 12.04.5-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 14.04.0-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 14.04.1-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 14.04.2-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 14.04.3-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 14.04.4-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 
14.04.5-DAILY-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 14.04.5-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 16.04-DAILY-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 16.04-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 16.04.0-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 18.04-DAILY-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 18.04-LTS UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 18.10 UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 18.10-DAILY UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 19.04-DAILY UbuntuServer Canonical SouthIndia /Subscriptions/<subscription_id>/Providers/Microsoft.Compute/Locations/SouthIndia/Publishers/Canonical/ArtifactTypes/V... 
#> # For Windows Server $Location = 'Central India' $Publisher = 'MicrosoftWindowsServer' $Offer = 'WindowsServer' Get-AzVMImageSku -Location $Location -PublisherName $Publisher -Offer $Offer <# Get-AzVMImageSku -Location $Location -PublisherName $Publisher -Offer $Offer Skus Offer PublisherName Location Id ---- ----- ------------- -------- -- 2008-R2-SP1 WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2008-R2-SP1-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2008-R2-SP1-zhcn WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2012-Datacenter WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2012-Datacenter-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2012-Datacenter-zhcn WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2012-R2-Datacenter WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2012-R2-Datacenter-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2012-R2-Datacenter-zhcn WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 
2016-Datacenter WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2016-Datacenter-Server-Core WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2016-Datacenter-Server-Core-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2016-Datacenter-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2016-Datacenter-with-Containers WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2016-Datacenter-with-RDSH WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2016-Datacenter-zhcn WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2019-Datacenter WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2019-Datacenter-Core WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2019-Datacenter-Core-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2019-Datacenter-Core-with-Containers WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 
2019-Datacenter-Core-with-Containers-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2019-Datacenter-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2019-Datacenter-with-Containers WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2019-Datacenter-with-Containers-smalldisk WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... 2019-Datacenter-zhcn WindowsServer MicrosoftWindowsServer CentralIndia /Subscriptions/899041ff-5768-4e79-931b-a9e9e9bad5fd/Providers/Microsoft.Compute/Locations/Centra... #>
In scripting, there are many ways to experience the same thing. Writing a function, the traditional way, is generally the best practice, though a quick script block is always an option…

Function Add {
    param (
        [Parameter(Mandatory=$true)]
        [int] $Number1,
        [Parameter(Mandatory=$true)]
        [int] $Number2
    )
    [int] $Sum = $Number1 + $Number2
    return $Sum
}

Add -Number1 4 -Number2 5

The quick way, as a script block…

$Add = {$args[0] + $args[1]}
. $Add 4 5
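A script block can also declare named parameters just like a function, which reads better than positional $args; a small sketch:

```powershell
# Script block with a param() block instead of positional $args
$Add = {
    param (
        [int] $Number1,
        [int] $Number2
    )
    $Number1 + $Number2
}

# Invoke with the call operator
& $Add 4 5    # 9
```

The call operator (&) runs the script block in its own scope, whereas dot-sourcing (.) runs it in the current scope; for a pure calculation like this, either works.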
SQL Server loves PowerShell; it makes a SQL Server DBA's life easy and simple. I have seen SQL Server automated with PowerShell to the extent that I stopped using SQL Server Management Studio (SSMS) ever since I started using PowerShell. A database administrator doesn't need SSMS all the time to connect to SQL Server if PowerShell is at hand. There are quite a few tools already available on the internet, from dbatools.io to the Idera PowerShell scripts, but every approach is unique.

SQL Server PSObject: SQL Server functionality within a single PowerShell object. PSObject is what I love most in PowerShell; you can customise an object with your own choice of properties and methods, and the usage is as simple as initiating the object and calling the methods of your choice. I have created a new PSObject with ConnectSQL and ExecuteSQL methods; they are the very basic and predominant functionalities for working with SQL Server.

# Create an object
$SQLServerObject = New-Object -TypeName psobject

And added a few essential properties, mainly used to establish the connection to SQL Server…

# Basic properties
$SQLServerObject | Add-Member -MemberType NoteProperty -Name ServerName -Value 'SQLServer'      # Server Name
$SQLServerObject | Add-Member -MemberType NoteProperty -Name DefaultPort -Value 1433            # Port
$SQLServerObject | Add-Member -MemberType NoteProperty -Name Database -Value 'master'           # Database
$SQLServerObject | Add-Member -MemberType NoteProperty -Name ConnectionTimeOut -Value 15        # Connection Timeout
$SQLServerObject | Add-Member -MemberType NoteProperty -Name QueryTimeOut -Value 15             # Query Timeout
$SQLServerObject | Add-Member -MemberType NoteProperty -Name SQLQuery -Value ''                 # SQL Query
$SQLServerObject | Add-Member -MemberType NoteProperty -Name SQLConnection -Value ''            # SQL Connection

The properties ServerName, DefaultPort, Database and ConnectionTimeOut must be set before you call either the connect method or the execute method. The SQLConnection property holds the SQL Server connection object used to execute queries with the execute method, and the SQLQuery property holds the query text to execute against the server named in the ServerName property; you can also pass the query while calling the execute method. Ensure the server is reachable using the TestConnection method…

# Method to ensure the server is reachable over the network
$SQLServerObject | Add-Member -MemberType ScriptMethod -Name TestConnection -Value {
    Test-Connection -ComputerName $this.ServerName -ErrorAction SilentlyContinue
}

Establish the connection and store the connection object in the SQLConnection property of the object…

# Method to establish the connection to SQL Server; holds the connection object for further use
$SQLServerObject | Add-Member -MemberType ScriptMethod -Name ConnectSQL -Value {
    [string] $ServerName = $this.ServerName
    [int] $Port = $this.DefaultPort
    [string] $Database = $this.Database
    [int] $TimeOut = $this.ConnectionTimeOut

    $SQLConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection
    $SQLConnection.ConnectionString = "Server = $ServerName,$Port; Database = $Database; Integrated Security = True; Connection Timeout = $TimeOut;"
    $SQLConnection.Open()
    $this.SQLConnection = $SQLConnection
}

The ExecuteSQL method executes queries using the connection established by the ConnectSQL method…

# ExecuteSQL method to execute queries using the connection established with ConnectSQL
$SQLServerObject | Add-Member -MemberType ScriptMethod -Name ExecuteSQL -Value {
    param (
        [Parameter(Mandatory=$false)]
        [string] $QueryText
    )

    # Select the runtime query / predefined query
    [string] $SQLQuery = $this.SQLQuery
    if ([string]::IsNullOrEmpty($QueryText) -eq $false) { $SQLQuery = $QueryText }

    # Verify the query is not null or empty, then execute
    if ([string]::IsNullOrEmpty($SQLQuery)) {
        Write-Host "Please add a query to this object or enter the query." -ForegroundColor Red
    }
    else {
        if ($this.SQLConnection.State -eq 'Open') {
            # SQL Command
            $SQLCommand = New-Object System.Data.SqlClient.SqlCommand
            $SQLCommand.CommandText = $SQLQuery
            $SQLCommand.CommandTimeout = $this.QueryTimeOut
            $SQLCommand.Connection = $this.SQLConnection

            # SQL Adapter
            $SQLAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
            $SQLAdapter.SelectCommand = $SQLCommand

            # Dataset
            $DataSet = New-Object System.Data.Dataset
            $SQLAdapter.Fill($DataSet) | Out-Null
            return $DataSet.Tables[0]
        }
        else {
            Write-Host "No open connection found." -ForegroundColor Red
        }
    }
}

And finally return the object…

# Return the object
return, $SQLServerObject

Now, let's see how we can connect to SQL Server and execute SQL queries. First, create the object…

PS C:\GitRepo> $SQL = .\Create-SQLServerObject.ps1
PS C:\GitRepo> $SQL

By default it takes the local host name as the server name, the default SQL Server port (1433), and the master database as the default database to establish the connection. Assign a server name and test the connectivity…

PS C:\GitRepo> $SQL.ServerName = 'SQLServer'
PS C:\GitRepo> $SQL.TestConnection()

If the server is accessible, establish the connection to SQL Server; if the SQL Server port is not the default, assign the port to the object first…

PS C:\GitRepo> $SQL.DefaultPort = 2866 # Just an example

Establish the connection…

PS C:\GitRepo> $SQL.ConnectSQL()
PS C:\GitRepo> $SQL.SQLConnection

Add the query text to the object and call the ExecuteSQL method…

PS C:\GitRepo> $SQL.SQLQuery = "Select database_id, name from sys.databases"
PS C:\GitRepo> $SQL.ExecuteSQL()

You can also pass the query while calling the method itself…

PS C:\GitRepo> $SQL.ExecuteSQL("Select @@Version as Version")

You can add any number of methods of your choice and customise the object to your requirements; you can even execute *.sql files as well… The complete code is available in my Git repository.
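Since ExecuteSQL accepts arbitrary query text, running a *.sql file is just a matter of reading the file content first; a sketch that assumes the $SQL object created above and a hypothetical script path:

```powershell
# Read the .sql file as a single string (-Raw keeps multi-line batches intact)
# and pass it to the ExecuteSQL method of the object created above.
$QueryText = Get-Content -Path '.\Queries\DatabaseList.sql' -Raw
$SQL.ExecuteSQL($QueryText)
```

Note that the SqlCommand used inside ExecuteSQL does not understand the GO batch separator; scripts containing GO would need to be split into individual batches before being passed in.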
When a PowerShell script must be executed with elevated permissions, add the #requires statement at the beginning of the script…

#requires -RunAsAdministrator
Set-ExecutionPolicy -ExecutionPolicy Unrestricted

For more details about #requires, run the help command below…

Get-Help about_requires
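#requires supports several other conditions besides -RunAsAdministrator, and multiple statements can be stacked at the top of a script; the script refuses to run unless every condition is met. For example:

```powershell
#requires -Version 5.1
#requires -Modules Az
#requires -RunAsAdministrator

# Script body starts here; it will only run in PowerShell 5.1 or later,
# with the Az module available, from an elevated session.
Write-Output "All #requires conditions satisfied."
```

The conditions are evaluated before any line of the script executes, so they are a cleaner guard than hand-written version or elevation checks inside the script body.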
I prefer Visual Studio Code for my PowerShell scripting; it has the PowerShell extension with IntelliSense and syntax highlighting, as well as an integrated PowerShell console. However, I use Windows PowerShell to execute the commands. By default, VS Code shows the PowerShell Integrated Console on startup, and that annoys me a bit, so I decided to disable the option… Open VS Code and go to File -> Preferences -> Settings (or click the Manage button on the side bar and select Settings, or simply use the shortcut Ctrl+,), then search for the PowerShell Integrated Console Show On Startup option; uncheck the highlighted option to keep the console hidden from your next start.
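The same option can also be set directly in settings.json (open it via the Open Settings (JSON) icon in the Settings editor); assuming a current version of the PowerShell extension, the setting key looks like this:

```json
{
    // Hide the PowerShell Integrated Console on startup
    "powershell.integratedConsole.showOnStartup": false
}
```

Editing settings.json is handy when you sync or share your VS Code configuration across machines.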
I don't know why, but for some reason Microsoft doesn't provide some solutions directly; directly as in you cannot achieve the outcome with a single PowerShell command. For example, to get the public IP address of an Azure VM, I expected it to be as simple as this…

Get-AzureRmPublicIpAddress -ResourceGroupName lab-rg -Name Workstation

However, there is no such command with those parameters, but still it's not very complicated. I have written a small PowerShell wrapper to retrieve the public IP address of an Azure VM, with a few extra abilities as well: apart from getting the public IP address, it will start the VM if it is not running (when the -StartIfVMIsNotRunning flag is enabled) and connect to an RDP session (with the -ConnectRDP flag).

Note: Most organizations use either a private IP or a DNS name to connect to a VM from their network; this is only useful for small businesses, or where there is no need for domain authentication and access happens from outside the local network.

The script has two parameter sets, Name and Object, which accept ResourceGroupName and VMName, or VMObject, along with the other common parameters. The parameters are as below…

[Parameter(Mandatory=$true,ParameterSetName='Name')]
[string] $ResourceGroupName, # ResourceGroup Name when the ParameterSetName is 'Name'

[Parameter(Mandatory=$true,ParameterSetName='Name')]
[string] $VMName, # Virtual Machine Name when the ParameterSetName is 'Name'

[Parameter(Mandatory=$true,ValueFromPipeline=$true,ParameterSetName='Object')]
[Microsoft.Azure.Commands.Compute.Models.PSVirtualMachine] $VMObject, # VM Object when the ParameterSetName is 'Object'

[Parameter(Mandatory=$false,ParameterSetName='Name')]
[Parameter(Mandatory=$false,ParameterSetName='Object')]
[switch] $StartIfVMIsNotRunning, # Start the VM, if it is not running

[Parameter(Mandatory=$false,ParameterSetName='Name')]
[Parameter(Mandatory=$false,ParameterSetName='Object')]
[switch] $ConnectRDP, # Connect a Remote Desktop session

[Parameter(Mandatory=$true,ParameterSetName='Help')]
[switch] $H # Get Help

Since the latest AzureRM (6.13.1) PowerShell module has some significant changes in the output of some cmdlets, ensure the latest module is loaded…

# Ensure the 6.13.1 version of the AzureRM module is loaded,
# because some commands' output has changed in this version
[System.Version] $RequiredModuleVersion = '6.13.1'
[System.Version] $ModuleVersion = (Get-Module -Name AzureRM).Version
if ($ModuleVersion -lt $RequiredModuleVersion) {
    Write-Verbose -Message "Import latest AzureRM module"
    break
}

Log in to the Azure account, if not logged in already…

# Log in to the Azure account, if it is not already logged in
if ([string]::IsNullOrEmpty($(Get-AzureRmContext))) {
    $null = Add-AzureRmAccount
}

Retrieve the VM running state and ensure it is running; if the -StartIfVMIsNotRunning flag is enabled, the VM will be started when it is not running. If the VM is not running and PublicIPAllocationMethod is set to Static, the public IP can still be retrieved, but if it is Dynamic, the VM must be in a running state…

# Retrieve the virtual machine running status
try {
    if ($PSCmdlet.ParameterSetName -eq 'Name') {
        $VM = Get-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $VMName -Status
    }
    elseif ($PSCmdlet.ParameterSetName -eq 'Object') {
        $VM = Get-AzureRmVM -ResourceGroupName $VMObject.ResourceGroupName -Name $VMObject.Name -Status
    }
}
catch {
    Write-Verbose -Message $_.Exception.Message
    break
}

# Check whether the vm PowerState is running
$VMStatus = $VM.Statuses | Where-Object { $_.Code -match 'running' }
if ([string]::IsNullOrEmpty($VMStatus)) {
    [bool] $ISVMRunning = $false
}
else {
    [bool] $ISVMRunning = $true
}

# If the VM is not running and the -StartIfVMIsNotRunning flag is enabled, then start the VM
if ($ISVMRunning -eq $false -and $StartIfVMIsNotRunning -eq $true) {
    $null = Start-AzureRMVM -ResourceGroupName $VM.ResourceGroupName -Name $VM.Name
    $ISVMRunning = $true
}

Now retrieve the public IP address of the Azure VM…

# Get the public IP address
$VirtualMachine = Get-AzureRMVM -ResourceGroupName $VM.ResourceGroupName -Name $VM.Name
$NICId = $VirtualMachine.NetworkProfile.NetworkInterfaces.id
$NICResource = Get-AzureRmResource -ResourceId $NICId
$PIPId = $NICResource.Properties.ipConfigurations.properties.publicIPAddress.id
$PIPResource = Get-AzureRmResource -ResourceId $PIPId
$PIP = $PIPResource.Properties.ipAddress

Exit the script if the VM is not running and PublicIPAllocationMethod is Dynamic, or no public IP is assigned…

# Exit the script if the VM is not running and PublicIPAllocationMethod is Dynamic, or no public ip is assigned
[string] $PublicIPAllocationMethod = $PIPResource.Properties.publicIPAllocationMethod
if ([string]::IsNullOrEmpty($PIP.IPAddressToString) -and $ISVMRunning -eq $false -and $PublicIPAllocationMethod -eq 'Dynamic') {
    Write-Verbose -Message $("Since {0} VM is not running and 'Public IP Allocation Method is Dynamic', unable to determine the Public IP.`nRun the command with -StartIfVMIsNotRunning flag" -f $VMName)
    break
}
elseif ([string]::IsNullOrEmpty($PIP.IPAddressToString) -and $ISVMRunning -eq $true) {
    Write-Verbose -Message $("No public ip is assigned to this {0} VM." -f $VMName)
    break
}

If the -ConnectRDP flag is enabled, the remote desktop connection will be established (only when the default RDP port is allowed in the inbound security rules); otherwise the script simply returns the public IP address…

# Connect to the VM when the -ConnectRDP flag is enabled and the VM is running
if ($ConnectRDP -and $ISVMRunning) {
    Invoke-Expression "mstsc.exe /v $($PIP.IPAddressToString)"
    break
}

# Just return the IP address when no flags are enabled
return, $PIP.IPAddressToString

And let's see some examples…

.EXAMPLE
C:\GitRepo> .\Get-ARMVMPIP.ps1 -ResourceGroupName lab-rg -VMName Workstation
xxx.xxx.xxx.xxx

Returns the public ip address when the VM is running, or when the VM is deallocated but the publicIPAllocationMethod is set to 'Static'.

.EXAMPLE
C:\GitRepo> $VM = Get-AzureRmVM -ResourceGroupName lab-rg -Name Workstation
C:\GitRepo> $VM | .\Get-ARMVMPIP.ps1
xxx.xxx.xxx.xxx

Returns the public ip address when the VM is running, or when the VM is deallocated but the publicIPAllocationMethod is set to 'Static'.

.EXAMPLE
C:\GitRepo> .\Get-ARMVMPIP.ps1 -ResourceGroupName lab-rg -VMName Workstation -StartIfVMIsNotRunning
xxx.xxx.xxx.xxx

Returns the public ip address when the VM is running, or starts the VM if it is not running and then returns the public ip.

.EXAMPLE
C:\GitRepo> .\Get-ARMVMPIP.ps1 -ResourceGroupName lab-rg -VMName Workstation -ConnectRDP
# Doesn't return any output; simply connects to an RDP session

Connects to an RDP session when the VM is running.

.EXAMPLE
C:\GitRepo> .\Get-ARMVMPIP.ps1 -ResourceGroupName lab-rg -VMName Workstation -ConnectRDP -StartIfVMIsNotRunning
# Doesn't return any output; simply connects to an RDP session

Connects to an RDP session when the VM is running; if the VM is not running it will be started and the RDP session established.

The complete code is available in my Git repository.
Know your PowerShell version

PS C:\Users\kiran> $PSVersionTable.PSVersion

Major  Minor  Build  Revision
-----  -----  -----  --------
5      1      17763  134

PS C:\Users\kiran> (Get-Host).Version

Major  Minor  Build  Revision
-----  -----  -----  --------
5      1      17763  134

PS C:\Users\kiran> $Host.Version

Major  Minor  Build  Revision
-----  -----  -----  --------
5      1      17763  134
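A caveat worth noting: $PSVersionTable reports the engine version, while Get-Host and $Host report the hosting application's version; in a remote session the two can differ, so $PSVersionTable is the reliable choice. It also carries more details than just the version number:

```powershell
# Engine edition: 'Desktop' for Windows PowerShell, 'Core' for PowerShell Core
$PSVersionTable.PSEdition

# The full table: PSVersion, PSEdition, CLRVersion, BuildVersion and more
$PSVersionTable
```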
Hello there, I am glad that you are here! I am really excited to start this blog, which is all about PowerShell along with other technical stuff. The posts I publish here come truly from my personal experience and serve as my future reference. So I strongly recommend that before you try anything from this blog directly in your production environment, you try it in a test environment first. You're always welcome to leave your comments, suggestions, and questions about my blog posts on my contact page. Thank you, and keep visiting my blog! Kiran Patnayakuni.