All posts by Thiyagu

How to run PowerShell Script from a Batch File

What is a .bat file?

A batch file is a text file that the Windows cmd.exe command line processor executes as a batch job. It contains a series of line commands in plain text that are executed to perform various tasks, such as starting programs or running maintenance utilities within Windows.

You can also read  – How to create Basic Chart by reading excel using PowerShell and How to remove the Multiple bindings (URLs) from IIS using PowerShell script

Why call my PowerShell script from a batch file

You can’t double-click to run .PS1 files, but you can execute a .BAT file that way. So, we’ll write a batch file to call the PowerShell script from the command line. There are several good reasons for doing this:

  • Non-technical users often struggle with PowerShell.
  • When a user double-clicks a PowerShell script file (*.ps1), by default it opens in an editor rather than executing.
  • Many scripts require admin privileges to run correctly; in that case the user needs to know how to run a PowerShell script as admin without opening a PowerShell console, which is difficult for non-technical users.

So in this post, we are going to discuss how you can call the PowerShell script from a batch file.

STEP #1: Creating sample .ps1 file

Write a simple site validation PowerShell script and save it as SiteValidationTestThroughBATFile.ps1.

#######################################################################
#Project   : Site validation through a BAT file
#Developer : Thiyagu S (dotnet-helpers.com)
#Tools     : PowerShell 5.1.15063.1155
#E-Mail    : mail2thiyaguji@gmail.com
#######################################################################

$_URL = 'https://dotnet-helpers.com'
try {
    $request  = [System.Net.WebRequest]::Create($_URL)
    $response = $request.GetResponse()
    if ($response.StatusCode -eq 200) {
        Write-Host "`nSite - $_URL is up (Return code: $($response.StatusCode) - $([int]$response.StatusCode))`n" -ForegroundColor Green
    }
    else {
        Write-Host "`nSite - $_URL is down (Return code: $($response.StatusCode))`n" -ForegroundColor Red
    }
    $response.Close()
}
catch {
    #A failed request (DNS error, timeout, non-success status) throws an exception, so treat it as down
    Write-Host "`nSite - $_URL is down`n" -ForegroundColor Red
}

STEP #2: Creating a batch file with .bat extension

Create a plain text file in Notepad and save it with the .bat extension as shown below. In this example, the batch file is named ClickToValidationSite_EXE.bat.

STEP #3: Calling .ps file inside the .bat file

Open ClickToValidationSite_EXE.bat by right-clicking it and choosing Edit; it will open in Notepad. Here we are going to call the PowerShell script, which will validate the site and report its status.

In the first line, @echo off disables command echoing so the commands themselves are not printed to the screen.
PowerShell.exe can be called from any CMD window or batch file to launch PowerShell. The -ExecutionPolicy Unrestricted
switch bypasses the execution policy so the script runs without restriction, and the -Command parameter runs (dot-sources) the script file.

@echo off

powershell.exe -ExecutionPolicy Unrestricted -Command ". 'C:\PowerShell\SiteValidation.ps1'"

TIMEOUT /T 10
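
If the script also needs admin rights (as mentioned in the earlier list), the batch file can relaunch PowerShell elevated. Below is a minimal sketch assuming the same script path; Windows will show a UAC prompt when it runs:

@echo off

powershell.exe -ExecutionPolicy Unrestricted -Command "Start-Process PowerShell -Verb RunAs -ArgumentList '-ExecutionPolicy Unrestricted -File C:\PowerShell\SiteValidation.ps1'"

TIMEOUT /T 10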

OUTPUT: Run PowerShell Script from a Batch

 

How to pass values between Tasks in a Pipeline using task.setvariable Command

Problem?

An Azure DevOps pipeline is a set of Tasks, each performing a specific job, and these tasks run inside an agent machine (i.e., a virtual machine). While a task is executing it is allocated resources, and after the task completes those resources are de-allocated. The same allocation/de-allocation process repeats for the other tasks in the pipeline. This means Task1 cannot directly communicate with Task2 or any other subsequent task (to pass values between tasks), as their execution scopes are completely isolated even though they run on the same virtual machine.

In this article, we are going to learn how to communicate between Tasks and pass values between Tasks in a Pipeline.

How to Pass Values between Tasks?

When you use PowerShell and Bash scripts in your pipelines, it’s often useful to be able to set variables that you can then use in future tasks. Newly set variables aren’t available in the same task. You’ll use the task.setvariable logging command to set variables in PowerShell and Bash scripts.

What is task.setvariable?

task.setvariable is a logging command that can be used to create a variable which can then be used across tasks within the pipeline, whether they are in the same job of a stage or across stages. Logging commands are written to standard output with the ##vso prefix; VSO stands for Visual Studio Online, which is part of Azure DevOps' early roots.

“##vso[task.setvariable variable=myStageVal;isOutput=true]this is a stage output variable”

Example:

- powershell: |
    Write-Host "##vso[task.setvariable variable=myVar;]foo"

- bash: |
    echo "##vso[task.setvariable variable=myVar;]foo"

SetVariable Properties

The task.setvariable command includes properties for setting a variable as secret, as an output variable, and as read only. The available properties include:

  • variable = the variable name (required)
  • issecret = true marks the variable as a secret
  • isoutput = true makes the variable available to downstream jobs and stages (set isoutput to true to use the variable in the next stage)
  • isreadonly = true makes the variable read only so it can't be overwritten by downstream tasks (see the sketch below)
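
For instance, a PowerShell step can emit the logging command with these properties set. The snippet below is a minimal sketch; the variable names and values are only placeholders:

#Emit logging commands from a PowerShell task (placeholder names/values)
#issecret=true masks the value in logs; isreadonly=true prevents later tasks from overwriting it
Write-Host "##vso[task.setvariable variable=apiKey;issecret=true]My5ecretValue"
Write-Host "##vso[task.setvariable variable=buildLabel;isreadonly=true]nightly-build"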

Share variables between Tasks within a Job

Let's now create a new variable in Task1, assign some value to it, and access the variable in the next task.

  • Create a variable named Token using the setvariable syntax and assign it some test value (e.g., TestTokenValue).
  • Display the value of the Token variable in the next task, as shown below (task name 'Stage1-Job1-Task2').
stages:
- stage: Stage1
  jobs:
  - job: Job1
    steps:
    - task: PowerShell@2
      displayName: 'Stage1-Job1-Task1'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "##vso[task.setvariable variable=token]TestTokenValue"
    - task: PowerShell@2
      displayName: 'Stage1-Job1-Task2'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "the Value of Token : $(token)"

Now, view the output of the variable in Stage1-Job1-Task2 as shown below.

Share variables between Tasks across the Jobs (of the same Stage)

As we discussed in the SetVariable properties section, we need to use the isoutput=true flag when we want to use the variable in a task located in another job.

pool:
  name: devopsagent-w-pprd01

stages:
- stage: Stage1
  jobs:
  - job: Stage1_Job1
    steps:
    - task: PowerShell@2
      name: 'Stage1_Job1_Task1'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "##vso[task.setvariable variable=token;isoutput=true;]TestTokenValue"

  - job: Stage1_Job2
    dependsOn: Stage1_Job1
    variables:
    - name: GetToken
      value: $[dependencies.Stage1_Job1.outputs['Stage1_Job1_Task1.token']]
    steps:
    - task: PowerShell@2
      displayName: 'Stage1-Job2-Task1'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "the Value of Token : $(GetToken)"
  1. Navigate to Stage1_Job1_Task1 and add the isoutput=true flag to the logging command, which lets us access the value outside the job.
  2. The job in which you want to access the variable must depend on the job that produces the output. Add dependsOn: Stage1_Job1 to Stage1_Job2.
  3. In Stage1_Job2, create a new variable named GetToken and set its value to $[dependencies.Stage1_Job1.outputs['Stage1_Job1_Task1.token']]. This maps the output of the dependent job into a variable of the current job. You can't access this expression directly in the script; it's mandatory to map the expression into the value of another variable.
  4. Finally, access the new variable in your script.
  5. Once isoutput=true is added, it's important to access the variable by prefixing the task name; otherwise it won't work.

OUTPUT:

Below, you can see how Job2 accesses the output of Job1.

Share variables between Tasks across Stages

As per the code below, I didn't specify a dependency (using dependsOn) between the stages, as Stage1 and Stage2 run one after the other. If you would like to access Stage1's variable in Stage3, then Stage3 must declare a dependency on Stage1.

To access a value from another stage we need to use the stageDependencies syntax, whereas between jobs of the same stage we used dependencies, as shown in the earlier YAML.

pool:
  name: devopsagent-w-pprd01

stages:
- stage: Stage1
  jobs:
  - job: Stage1_Job1
    steps:
    - task: PowerShell@2
      name: 'Stage1_Job1_Task1'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "##vso[task.setvariable variable=token;isoutput=true;]TestTokenValue"

- stage: Stage2
  jobs:
  - job: Stage2_Job1
    variables:
    - name: getToken
      value: $[stageDependencies.Stage1.Stage1_Job1.outputs['Stage1_Job1_Task1.token']]
    steps:
    - task: PowerShell@2
      displayName: 'Stage2-Job1-Task1'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "the Value of Token from Stage2: $(getToken)"

OUTPUT:

Export CSV file with Array values using Powershell

One of the best and easiest ways to put data into an easy-to-read format is a CSV (comma-separated values) file. A CSV file has a line of headers to indicate the column names, followed by the values for each column. Structured data is needed to position the values correctly in the CSV file, and PowerShell offers a few options for producing it. Since a CSV file is just a text file, it can loosely be created with PowerShell's Add-Content cmdlet. Here I explain how to export an array to a CSV file with the help of the simple Add-Content cmdlet.

We can use either Add-Content or Export-Csv to create and write CSV files. The best way is Export-Csv (if we have created structured data/objects); otherwise we can use the Add-Content cmdlet. Add-Content does not natively understand the CSV format, but it can still be used to write one line at a time.
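
For the structured-data route, a minimal Export-Csv sketch might look like the following; the employee rows are made-up sample values and the path matches the one used later in this post:

#Build structured objects so Export-Csv can emit the header row and values automatically
$FilePathLocation = "C:\PowerShell\EmployeeDetails.csv"
$Empdetails = @(
    [PSCustomObject]@{ EmpName = 'Emp1'; EMPID = '1001'; EMPemail = 'emp1@dotnet-helpers.com'; EMPpin = '1111' }
    [PSCustomObject]@{ EmpName = 'Emp2'; EMPID = '1002'; EMPemail = 'emp2@dotnet-helpers.com'; EMPpin = '2222' }
)
#Export-Csv writes the column headers and quotes the values for us
$Empdetails | Export-Csv -Path $FilePathLocation -NoTypeInformation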

############################################################
#Project : Read Array values and generate CSV file
#Developer : Thiyagu S (dotnet-helpers.com)
#Tools : PowerShell 5.1.15063.1155 
#E-Mail : mail2thiyaguji@gmail.com 
############################################################
#Declared the File path location with the CSV format.
$FilePathLocation = "C:\PowerShell\EmployeeDetails.csv"
#Declared the Array
$Empdetails =@();
$empCount = 3;
#Declared the Column names
$columnName = '"EmpName","EMPID","EMPemail","EMPpin"'
#Add the column title row to the CSV file
Add-Content -Path $FilePathLocation -Value $columnName
#Loop to get user input 3 times
for ($loopindex=0; $loopindex -lt $empCount; $loopindex++)
{
#Getting the User Input from the console
$Empdetails += Read-Host " Enter EMPName,EMPID,EMPemail,EMPpin by comma separator"
}
#Loop through the employee details and append each row to the CSV file
$Empdetails | ForEach-Object { Add-Content -Path $FilePathLocation -Value $_ }

Output: 

Export CSV file with Array

What do you think?

I hope you now have an idea of how to read array values and generate a CSV file with PowerShell. I would like to hear feedback from my readers. Your valuable feedback, questions, or comments about this article are always welcome.

Rebuild index to reduce Fragmentation in SQL Server

Here we will learn how to identify index fragmentation and resolve it by rebuilding indexes in SQL Server. Index fragmentation identification and index maintenance are important parts of the database maintenance task. Microsoft SQL Server keeps updating index statistics with every Insert, Update, or Delete on a table. Index fragmentation is expressed as a percentage and can be fetched from SQL Server DMVs. Based on that percentage, users can take the indexes into maintenance and reduce fragmentation with a Rebuild or Reorganize operation.

Introduction:

In SQL Server, both “rebuild” and “reorganize” refer to operations that can be performed on indexes to address fragmentation. However, they are distinct operations with different characteristics. Let’s explore the differences between rebuilding and reorganizing indexes:

Note: Index optimization is a maintenance activity that improves query performance and reduces resource consumption. Make sure you plan to perform index maintenance during off-business hours or low-traffic hours (fewer requests to the database).

Advantages of Rebuild index to reduce Fragmentation:

  • Removes both internal and external fragmentation.
  • Reclaims unused space on data pages.
  • Updates statistics associated with the index.

Considerations:

  • Requires more system resources.
  • Locks the entire index during the rebuild process, potentially causing blocking.

How to find the Fragmentation?

Here we execute a SQL script to check the fragmentation details for a specific database; the result is shown as a percentage.

Method 1:

-- Script will give details of fragmentation in percentage
SELECT OBJECT_NAME(ip.object_id) AS TableName,
       i.name AS IndexName,
       ip.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, NULL) ip
JOIN sys.indexes i
    ON ip.object_id = i.object_id AND ip.index_id = i.index_id
JOIN sys.dm_db_index_usage_stats ius
    ON ip.object_id = ius.object_id AND ip.index_id = ius.index_id

Method 2:

SELECT
DB_NAME() AS DBName
,OBJECT_NAME(ps.object_id) AS TableName
,i.name AS IndexName
,ips.index_type_desc
,ips.avg_fragmentation_in_percent
FROM sys.dm_db_partition_stats ps
INNER JOIN sys.indexes i
ON ps.object_id = i.object_id
AND ps.index_id = i.index_id
CROSS APPLY sys.dm_db_index_physical_stats(DB_ID(), ps.object_id, ps.index_id, null, 'LIMITED') ips
ORDER BY ips.avg_fragmentation_in_percent DESC
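
If you prefer running the same check from PowerShell (in line with the rest of this blog), below is a minimal sketch assuming the SqlServer module is installed; the server and database names are placeholders:

#Requires the SqlServer module: Install-Module SqlServer
$query = @"
SELECT OBJECT_NAME(ips.object_id) AS TableName,
       i.name AS IndexName,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ips
JOIN sys.indexes i ON ips.object_id = i.object_id AND ips.index_id = i.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC
"@
#Placeholder server and database names - replace with your own
Invoke-Sqlcmd -ServerInstance "localhost" -Database "MyDatabase" -Query $query | Format-Table -AutoSize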

Rebuild index to reduce Fragmentation:

The REBUILD operation recreates the entire index: the existing index is dropped and a new one is built from scratch. During the rebuild the index is effectively offline, and there can be a period where it is not available for queries. In simple terms, REBUILD locks the table for the whole operation, which may take hours (or even days for a very large table). The basic syntax for rebuilding a single index is ALTER INDEX [IndexName] ON [TableName] REBUILD.

Rebuilding Full Index on selected database:

After executing the index rebuild on a specific database, you can see that the fragmentation is reduced, as shown in the image below.

-- Rebuild ALL Indexes
-- This will rebuild all the indexes on all the tables in your database.

SET NOCOUNT ON
GO

DECLARE rebuildindexes CURSOR FOR
SELECT table_schema, table_name  
FROM information_schema.tables
	where TABLE_TYPE = 'BASE TABLE'
OPEN rebuildindexes

DECLARE @tableSchema NVARCHAR(128)
DECLARE @tableName NVARCHAR(128)
DECLARE @Statement NVARCHAR(300)

FETCH NEXT FROM rebuildindexes INTO @tableSchema, @tableName

WHILE (@@FETCH_STATUS = 0)
BEGIN
   SET @Statement = 'ALTER INDEX ALL ON '  + '[' + @tableSchema + ']' + '.' + '[' + @tableName + ']' + ' REBUILD'
   --PRINT @Statement 
   EXEC sp_executesql @Statement  
   FETCH NEXT FROM rebuildindexes INTO @tableSchema, @tableName
END

CLOSE rebuildindexes
DEALLOCATE rebuildindexes
GO
SET NOCOUNT OFF
GO

Summary: 

Index fragmentation occurs due to frequent INSERT, UPDATE, and DELETE operations in SQL Server, leading to degraded query performance. Regular index maintenance, including identifying and resolving fragmentation, is crucial for database optimization.

Key Points:

  • Fragmentation Identification:
    • Use DMVs (Dynamic Management Views) like sys.dm_db_index_physical_stats to check fragmentation percentage.
    • Two methods are provided to analyze fragmentation levels across indexes.
  • Rebuild vs. Reorganize:
    • Rebuild: Drops and recreates the index entirely, removing internal and external fragmentation, reclaiming space, and updating statistics. However, it locks the index and consumes more resources.
    • Reorganize: Defragments the index without rebuilding it, suitable for low to moderate fragmentation.
  • When to Rebuild Indexes:
    • Recommended for high fragmentation (typically above 30%).
    • Should be scheduled during off-peak hours to minimize blocking.
  • How to Rebuild Indexes:
    • A script is provided to rebuild all indexes in a database dynamically.
    • Rebuilding reduces fragmentation, improving query performance and resource efficiency.

By regularly monitoring and rebuilding fragmented indexes, database administrators can maintain optimal SQL Server performance.

How to Continue Azure Pipeline on failed task

Introduction

Sometimes failing scripts do not fail the task when they should, and sometimes a failing command should not fail the task. How do we handle these situations and continue an Azure Pipeline on a failed task?

Sometimes you may have pipeline tasks that depend on an external reference which can fail at any time. In these scenarios, if the task fails (perhaps intermittently) because of an issue in that external reference, your entire pipeline fails, and you have no insight into when the underlying bug will be fixed. In such a case, you want the pipeline to keep running despite an issue in that task, and to ensure that any future failure of this task does not lead to a pipeline failure.

This simple technique can be used in scenarios where you have a non-mandatory task that’s failing intermittently and you want to continue the execution of the pipeline

Solution : 

In this case, it absolutely makes sense to continue the execution of the next set of tasks (continue the Azure Pipeline on a failed task). In this post, we are going to learn how to continue the execution of the pipeline when a particular task has failed, using the continueOnError / failOnStderr / failOnStandardError properties.

Using the continueOnError attribute in a PowerShell/script task

Let's build a pipeline with a few tasks and simulate an error in one of them, as shown below. In the code, note the following points.

In the task named "continueOnError Task", targetType is intentionally set to 'filePath' while an inline script is supplied instead of a script file, which makes the task fail. Second, an attribute called continueOnError has been added so that any error in this task is ignored while executing the pipeline.

steps:
- task: PowerShell@2
  displayName: "continueOnError Task"
  continueOnError: true
  inputs:
    # targetType is intentionally 'filePath' while only an inline script is provided,
    # so this task fails at runtime
    targetType: 'filePath'
    script: |
      Write-Host "Continue if any issue here"

- task: PowerShell@2
  displayName: "No Error Task"
  continueOnError: true
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Block if any issue here"

Now, when you run the pipeline, the error for that task is indicated and execution still carries forward, as shown below.

 

Like the PowerShell task, you can continue on error in other tasks as well, as shown below (click the link below for a reference on the other tasks).

Summary:

In this post, we have learnt how to continue the execution of the pipeline in spite of an error in one of the tasks, that is, how to continue an Azure Pipeline on a failed task. This simple technique can be used in scenarios where you have a non-mandatory task that's failing intermittently and you want to continue the execution of the pipeline.

How to Enable additional logs in Azure pipeline execution

One of the most important aspects of Azure DevOps pipeline development is having tools and techniques in place to find the root cause of any error that occurs during pipeline execution. In this article, we will learn how to review the logs that help troubleshoot errors in pipelines by enabling additional logs during Azure pipeline execution.

By default, the Azure DevOps pipeline provides logs with information about the execution of each step. In case of an error, or when you need more information to debug, the default logs may not help you understand what went wrong in the pipeline execution. In those cases, it helps to get more diagnostic logs about each step in the Azure DevOps pipeline.

Below are two different techniques to enable additional logging.

Enable System Diagnostics logs for a specific pipeline run

If you would like to get additional logs for a specific pipeline run, all you need to do is select the Enable system diagnostics checkbox as shown in the image below and click the Run button.

Enable System Diagnostics logs for all pipeline runs

If you always want System Diagnostics enabled, capturing the diagnostics trace even for runs triggered automatically (continuous integration scenarios), you need to create a variable in the pipeline named system.debug and set its value to true as shown below.

System.debug = true (generates diagnostic logs during pipeline execution)
System.debug = false (does not generate diagnostic logs during pipeline execution)

Once we set the value of system.debug to true in our variable group (which is referenced in our pipeline), the pipeline starts showing additional logs in purple, as shown below.

System.debug = true

System.debug = false
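
If you also want your own script output to stand out alongside these diagnostic lines, a PowerShell task can emit messages with the ##[debug] log-formatting prefix. This is a minimal sketch, and the message text is only a placeholder:

#Messages prefixed with ##[debug] are rendered as debug lines in the pipeline log
Write-Host "##[debug]Placeholder debug message from my script"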

Note: If you would like to view the logs without any colors, you can click the View raw log button, which opens the logs in a separate browser window; you can save them from there if required.

Download Logs from Azure pipeline:

If you would like to share the logs with other teams who don't have access to your pipeline, you can download them by clicking the Download logs button on the pipeline summary page, as shown below, and then share the file.

 

Understanding the directory structure created by Azure DevOps tasks

If you are a beginner in Azure DevOps, knowing when and which folders are created and populated by pipeline tasks is one of the first steps in understanding the directory structure created by Azure DevOps tasks.

The Azure DevOps agent supports Windows, Linux/Ubuntu and macOS operating systems, but in this post we are going to explore it from a Windows agent machine. Let's try to understand the folder structure by creating a very simple YAML-based Azure DevOps pipeline and adding tasks based on the instructions below.

Let us understand the directory structure created by Azure DevOps tasks here!

Let us list all the folders that are created during pipeline execution. Together they are called the workspace, and the local folder path can be referred to using a pre-defined variable called $(Pipeline.Workspace).

In the YAML below, we add a PowerShell task that prints the folder structure inside $(Pipeline.Workspace).

- task: PowerShell@2
  displayName: Show all Folders in $(Pipeline.Workspace) 
  inputs:
    targetType: 'inline'
    pwsh: true
    script: |
      Get-ChildItem -path $(Pipeline.Workspace)

Once your pipeline has executed, you will be able to see the folders a, b, s and TestResults available in the workspace, as shown in the snapshot below.

Now, based on the above image, let's understand these folders and their usage in detail.

Folder Name: a
Referred using: $(Build.ArtifactStagingDirectory)/ $(Build.StagingDirectory)/ $(System.ArtifactsDirectory)

The artifact staging directory is a pre-defined variable used in build pipelines for storing the artifacts of the solution being built (simply, its output artifacts). In simple terms, it is the output of the build process for any type of solution (.NET, Java, Python, etc.), or it could be as simple as copied files.

The Publish Build Artifacts task creates an artifact from whatever is in this folder. The folder is cleared/purged before each new build.

Folder Name: b
Referred using: $(Build.BinariesDirectory)

The binaries directory is a pre-defined variable used for storing the output of compiled binaries that are created as part of the compilation process.

Folder Name: s
Referred using: $(System.DefaultWorkingDirectory) / $(Build.SourcesDirectory)

The default working directory is a pre-defined variable that is mostly used to store the source code of the application. $(System.DefaultWorkingDirectory) is used automatically by the checkout step, which downloads the code as shown in the snapshot below. In simple terms, this is the working directory, where your source code is stored.

Folder Name: TestResults
Referred using: $(Common.TestResultsDirectory)

The test results directory is a local directory on the agent that can be used for storing test results.
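
To see where these folders point on your agent, the pre-defined variables can also be read as environment variables from a PowerShell task. Below is a minimal sketch using the standard uppercase environment-variable forms of the variables discussed above:

#Print the common pipeline folder variables via their environment-variable forms
Write-Host "Workspace          : $env:PIPELINE_WORKSPACE"
Write-Host "Artifact staging a : $env:BUILD_ARTIFACTSTAGINGDIRECTORY"
Write-Host "Binaries b         : $env:BUILD_BINARIESDIRECTORY"
Write-Host "Sources s          : $env:BUILD_SOURCESDIRECTORY"
Write-Host "Test results       : $env:COMMON_TESTRESULTSDIRECTORY"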

Summary of directory structure created by Azure DevOps tasks

  • a – $(Build.ArtifactStagingDirectory) / $(Build.StagingDirectory) / $(System.ArtifactsDirectory): staging location for build output/artifacts; cleared before each build.
  • b – $(Build.BinariesDirectory): output location for compiled binaries.
  • s – $(System.DefaultWorkingDirectory) / $(Build.SourcesDirectory): working directory where the source code is checked out.
  • TestResults – $(Common.TestResultsDirectory): local directory for storing test results.

Variable Substitution in Config using YAML DevOps pipeline

As a DevOps engineer, you are responsible for developing Azure DevOps pipelines that replace these values (DEV/TEST/PREPROD/PROD) based on the environment, because the configuration values change across environments. In this article, we are going to learn how to dynamically change environment-specific values (variable substitution) in Azure DevOps pipelines using an Azure DevOps extension called Replace Tokens.

In my previous article, we discussed that DevOps engineers should ensure all secrets are kept inside Key Vault instead of being used directly from an Azure DevOps variable group. But not every project makes the same decision; many projects still use the variable group for maintaining secrets and lock them down. This article focuses on that setup and explains how to manage environment-specific configurations (variable substitution) using Replace Tokens, an Azure DevOps Marketplace extension.

Example Config File

Below is the sample config file which we are going to use for variable substitution in the YAML Azure DevOps pipeline.

These configuration values are environment specific and have different values in different environments. DevOps engineers have to develop Azure DevOps pipelines that replace these values in the config below. In my case the smtp-host, smtp-username and smtp-password differ between the lower environments (dev/qa and preprod) and the higher environment.

How to use Replace Tokens Extension in Azure YAML pipeline

Here we are going to use Replace Tokens task to replace tokens in config files with variable values.

The parameters of the task are described below (the YAML name is given in parentheses); a conceptual sketch of the token replacement follows the list:

  • Root directory (rootDirectory): the base directory for searching files. If not specified the default working directory will be used. Default is empty string
  • Target files (targetFiles): the absolute or relative newline-separated paths to the files to replace tokens. Wildcards can be used (eg: **\*.config for all .config files in all sub folders). Default is **/*.config
  • Token prefix (tokenPrefix): when using custom token pattern, the prefix of the tokens to search in the target files. Default is #{
  • Token suffix (tokenSuffix): when using custom token pattern, the suffix of the tokens to search in the target files. Default is }#
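
To make the token pattern concrete, here is a rough PowerShell sketch of what the replacement conceptually does (this is not the extension's implementation); the config path and token values below are placeholders, and the default #{ }# pattern is assumed:

#A conceptual sketch of token replacement, not the Replace Tokens extension itself
$configPath  = 'src/Foundation/dotnethelpers.Foundation.SMTP.config'
$tokenValues = @{
    'smtp-host'     = 'smtp.example.com'
    'smtp-username' = 'placeholder-user'
    'smtp-password' = 'placeholder-password'
}

$content = Get-Content -Path $configPath -Raw
foreach ($name in $tokenValues.Keys) {
    #Replace each #{token-name}# occurrence with its value
    $content = $content.Replace("#{$name}#", $tokenValues[$name])
}
Set-Content -Path $configPath -Value $content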

Example 1:  Replace with Target files parameter

- task: replacetokens@5
  displayName: 'replacing token in new config file'
  inputs:
    targetFiles: |
      src/Feature/dotnethelpers.Feature.Common.General.config
      src/Foundation/dotnethelpers.Foundation.SMTP.config
    encoding: 'auto'
    writeBOM: true
    actionOnMissing: 'warn'
    keepToken: false
    tokenPrefix: '{'
    tokenSuffix: '}'
    useLegacyPattern: false
    enableTelemetry: true

Note: the task only works on text files; if you need to replace tokens in an archive, you will need to extract the files first and archive them back afterwards.

Example 2:  Replace with Root directory and Target files parameter

As you can see, we can also give the target files as a comma-separated list, as in the YAML below.

- task: replacetokens@5
  inputs:
    rootDirectory: 'src/Feature/Forms/code/App_Config/Include/Feature/'
    targetFiles: 'dotnethelpers.Feature.config,dotnethelpers.Foundation.SMTP.config'
    encoding: 'auto'
    tokenPattern: 'default'
    writeBOM: true
    actionOnMissing: 'warn'
    keepToken: false
    actionOnNoFiles: 'continue'
    enableTransforms: false
    enableRecursion: false
    useLegacyPattern: false
    enableTelemetry: true

Example 3:  Replace with wildcard

As per the targetFiles path below, Replace Tokens will search all the .config files inside the node folder and replace the tokens where applicable.

- task: replacetokens@5
  displayName: replacing token
  inputs:
    targetFiles: |
      node/*.config
    encoding: 'auto'
    writeBOM: true
    actionOnMissing: 'warn'
    keepToken: false
    tokenPrefix: '#{'
    tokenSuffix: '}#'
    useLegacyPattern: false
    enableTelemetry: true

Sample Output: Variable Substitution

Add Tags On An AZURE SQL DATABASE Using PowerShell

As system admins/DevOps engineers, during audit time we often need to complete findings (by grouping resources for filtering) quickly instead of doing the work manually. In the same way, we received a review comment that all resources need to have tags. Completing this task manually would take huge effort since we maintain a lot of resources, so we decided to automate it. Here we are going to discuss how to add tags on an Azure SQL database using PowerShell.

What Is A Tag In Azure?

An Azure tag is a name-value pair attached to a resource. The New-AzTag cmdlet creates a predefined Azure tag or adds values to an existing tag, and it creates or updates the entire set of tags on a resource or subscription.

Azure tagging is an excellent feature from Microsoft that helps you logically group your Azure resources and track them. It also helps to automate the deployment of resources and, importantly, provides visibility into the resource costs each group is liable for.

Syntax: New-AzTag [-ResourceId] <String> [-Tag] <Hashtable> [-DefaultProfile <IAzureContextContainer>] [-WhatIf] [-Confirm] [<CommonParameters>]


What are the ways to create Tags in Azure?

Azure Tags are key-value pairs that can be created and assigned to resources in Azure using the Azure Portal, PowerShell, Azure CLI, etc.

Note: Tag names are case-insensitive, but tag values are case-sensitive.

Why Use Azure Tags?

The main intention of Azure tags is to organize the resources in the Azure Portal. Organizing resources properly helps you identify the category each resource belongs to. So basically, Azure tags are name-value pairs that help to organize Azure resources in the Azure Portal.

For example, when you have many resources in your Azure Portal, tags really help you categorize them. Suppose you have 6 virtual machines (VMs) in your Azure subscription: 2 are in the development environment, 2 are in the QA environment, and the remaining 2 belong to the production environment. We can tag them as Environment = Development, Environment = QA, or Environment = Production, and then easily view the resources that belong to each environment.


How to Create Azure Tags Using PowerShell


Step 1: Connect to your Azure account using Connect-AzAccount

Before starting, please gather the service principal secret, AppId, and tenantId needed to connect to Azure and perform operations against Azure services.

#Converts plain text or encrypted strings to secure strings.
$SecureServicePrinciple = ConvertTo-SecureString -String "rfthr~SSDCDFSDFE53Lr3Daz95WF343jXBAtXADSsdfEED" -AsPlainText -Force
#Assigning the App ID
$AppId = "0ee7e633-0c49-408e-b956-36d62264f644"
#Assigning the Tenant ID
$tenantId= "32cf8ba2-403a-234b-a3b9-63c2f8311778"
#storing a username and password combination securely.
$pscredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $AppId, $SecureServicePrinciple
#Connect to Azure with an authenticated account for use with cmdlets from the Az PowerShell modules.
Connect-AzAccount -ServicePrincipal -Credential $pscredential -Tenant $tenantId

Step 2: Define the tags and assign them as a hashtable

Based on my requirement I added the tags below; you can create any number of tags based on how you segregate your resources.

$tags = @{"Business Unit"="WX_Digital"; "Environment"="PROD"; "Owner"="dotnet-helpers" ; "Project"="webapp"}


Step 3: Getting all the SQL database in the specific resources group

Example 1: Get single SQL database to update

  • The Get-AzResource cmdlet gets all the resources from the subscription, filtered here to the SQL database resource type.
  • After executing the script below, the $RESOURCE.Id variable holds the resource IDs of the databases under the specified resource type, as shown in the snapshot below.
  • -ResourceType: the resource type of the resource(s) to be retrieved, for example Microsoft.Compute/virtualMachines.
#Get the single database using a Where-Object filter
$RESOURCE = Get-AzResource -ResourceGroupName "rg-dgtl-pprd" -ResourceType "Microsoft.Sql/servers/databases" | Where-Object name -Like 'sqlsrvr-dgtl-pprd/sitecore_master'

$resourceId = $RESOURCE.Id

Example 2: Get all the database to update

#Gets All the database 
$RESOURCE = Get-AzResource -ResourceGroupName "rg-dgtl-pprd" -ResourceType "Microsoft.Sql/servers/databases" 

$resourceIds = $RESOURCE.Id

Step 4: Apply the new tags using the New-AzTag cmdlet

New-AzTag creates a predefined Azure tag or adds values to an existing tag; when used with -ResourceId, it creates or updates the entire set of tags on the resource or subscription.

-ResourceId : The resource identifier for the entity being tagged. A resource, a resource group or a subscription may be tagged.

Example 1: Add tags to a single database

#Creates or updates the entire set of tags on a resource or subscription.
#The resource identifier for the entity being tagged. A resource, a resource group or a subscription may be tagged.
new-AzTag -ResourceId $resourceId -Tag $tags

Example 2: Add tags to all databases under the resource group

foreach($resourceId in $resourceIds){

Write-Output $resourceId
#Creates or updates the entire set of tags on a resource or subscription.
new-AzTag -ResourceId $resourceId -Tag $tags

}

Output:

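
Note that New-AzTag replaces the entire set of tags on the resource. If you only want to add or update specific tags while keeping the existing ones, the Update-AzTag cmdlet with -Operation Merge can be used instead; a minimal sketch:

#Merge the new tags into the existing ones instead of replacing the whole set
foreach($resourceId in $resourceIds){
    Update-AzTag -ResourceId $resourceId -Tag $tags -Operation Merge
}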