Using Shared Access Signature (SAS) tokens with AzCopy is common — but rotating tokens and handling them securely can be a hassle. To improve security and simplify our automation, I recently replaced SAS-based authentication in our scheduled AzCopy jobs with an Azure User Assigned Managed Identity (UAMI).
In this post, I’ll walk through how to:
- Replace AzCopy SAS tokens with managed identity authentication
- Assign the right roles to the UAMI
- Use azcopy login to authenticate non-interactively
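The switch boils down to three steps: assign the UAMI to the machine running the job, grant it a data-plane role on the storage account, and log in with `azcopy login --identity`. Here is a minimal sketch of those commands — the client ID, subscription, resource group, account, and container names are all placeholders:

```shell
# Grant the UAMI data-plane access to the storage account (one-time setup).
az role assignment create \
    --role "Storage Blob Data Contributor" \
    --assignee "<client-id>" \
    --scope "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"

# On the job host: authenticate AzCopy with the user-assigned identity.
azcopy login --identity --identity-client-id "<client-id>"

# Copy with no SAS token on the URL; access comes from the role assignment.
azcopy copy "C:\backups\*" "https://<account>.blob.core.windows.net/<container>/" --recursive
```

On recent AzCopy versions you can skip the explicit login step by setting `AZCOPY_AUTO_LOGIN_TYPE=MSI` and `AZCOPY_MSI_CLIENT_ID` in the scheduled job's environment, which works well for non-interactive runs.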
Storing SQL usernames and passwords in application configuration files is still common practice — but it poses a significant security risk. As part of improving our cloud security posture, I recently completed a project to eliminate plain text credentials from our app connection strings by switching to Azure User Assigned Managed Identity (UAMI) authentication for our SQL Managed Instance.
In this post, I’ll walk through how to:
- Securely connect to Azure SQL Managed Instance without using usernames or passwords
- Use a User Assigned Managed Identity (UAMI) for authentication
- Test this connection using the new Go-based sqlcmd CLI
- Update real application code to remove SQL credentials
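Once the UAMI is assigned to the app's host and created as a contained user in the target database (`CREATE USER [<identity-name>] FROM EXTERNAL PROVIDER;`), the Go-based sqlcmd gives a quick way to verify the connection. A sketch, with placeholder server, database, and client ID values:

```shell
# Sketch: verify UAMI authentication against SQL Managed Instance with go-sqlcmd.
# <mi-name>, <dns-zone>, <database>, and <client-id> are placeholders.
sqlcmd -S "<mi-name>.<dns-zone>.database.windows.net" \
    -d "<database>" \
    --authentication-method ActiveDirectoryManagedIdentity \
    -U "<client-id>" \
    -Q "SELECT SUSER_SNAME();"
```

For application code using Microsoft.Data.SqlClient, the equivalent connection string drops the credentials entirely: `Server=...;Database=...;Authentication=Active Directory Managed Identity;User Id=<client-id>;` — the User Id field carries the UAMI's client ID rather than a SQL login.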
Over the course of this blog series, we've successfully completed the Cloud Resume Challenge using Terraform as our infrastructure-as-code tool. Let's recap what we've accomplished:
- Set up our development environment with Terraform and AWS credentials
- Deployed a static website using S3, CloudFront, Route 53, and ACM
- Built a serverless backend API with API Gateway, Lambda, and DynamoDB
- Implemented CI/CD pipelines with GitHub Actions for automated deployments
- Added security enhancements like OIDC authentication and least-privilege IAM policies
The final architecture ties these components together: CloudFront and S3 serve the static resume site (with Route 53 and ACM handling DNS and TLS), API Gateway, Lambda, and DynamoDB power the visitor counter, and GitHub Actions keeps everything deployed from code.
The most valuable aspect of this project is that we've built a completely automated, production-quality cloud solution. Every component is defined as code, enabling us to track changes, roll back if needed, and redeploy the entire infrastructure with minimal effort.
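That "redeploy with minimal effort" claim is just the standard Terraform workflow. A sketch, assuming AWS credentials and the series' configuration are in place:

```shell
terraform init               # install providers/modules, connect the state backend
terraform plan -out=tfplan   # preview every change before touching AWS
terraform apply tfplan       # apply exactly the plan that was reviewed
```

Saving the plan to a file and applying that file guarantees the changes you reviewed are the changes that run — the same pattern the GitHub Actions pipeline uses.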
In our previous posts, we built the frontend and backend components of our cloud resume project. Now it's time to take our implementation to the next level by implementing continuous integration and deployment (CI/CD) with GitHub Actions.
In our previous posts, we set up the frontend infrastructure for our resume website using Terraform. Now it's time to build the backend API that will power our visitor counter.
In the previous post, we set up our Terraform environment and outlined the architecture for our Cloud Resume Challenge project. Now it's time to start building! In this post, we'll focus on deploying the first component: the static website that will host our resume.
The Cloud Resume Challenge is a hands-on project designed to build a real-world cloud application while showcasing your skills in AWS, serverless architecture, and automation. Many implementations of this challenge use AWS SAM or manual setup via the AWS console, but in this series, I will demonstrate how to build the entire infrastructure using Terraform. 💡
When I first discovered the Cloud Resume Challenge, I was immediately intrigued by the hands-on approach to learning cloud technologies. Having some experience with traditional IT but wanting to transition to a more cloud-focused role, I saw this challenge as the perfect opportunity to showcase my skills.
I chose Terraform over AWS SAM or CloudFormation because:
- Multi-cloud flexibility - While this challenge focuses on AWS, Terraform skills transfer to Azure, GCP, and other providers
- Declarative approach - I find the HCL syntax more intuitive than YAML for defining infrastructure
- Industry adoption - In my research, I found that Terraform was highly sought after in job postings
- Strong community - The extensive module registry and community support made learning easier
This series reflects my personal journey through the challenge, including the obstacles I overcame and the lessons I learned along the way.
✅ Set up Application Insights on an IIS-based web farm.
✅ Configure Log Analytics, Data Collection Rules, and Data Collection Endpoints.
✅ Use PowerShell to install the Application Insights agent.
✅ Monitor live metrics, failures, performance, and logs in real time.
By the end, you'll have a fully monitored IIS-based web farm using Azure! 🎯
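The agent install step can be scripted per node with the Az.ApplicationMonitor module (the Application Insights Agent, formerly Status Monitor v2). A sketch — the connection string is a placeholder you'd copy from your Application Insights resource:

```powershell
# Sketch: install and enable the Application Insights Agent on one IIS node.
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope Process -Force
Install-Module -Name Az.ApplicationMonitor -Force

# Instrument every IIS application on this server.
Enable-ApplicationInsightsMonitoring -ConnectionString "InstrumentationKey=<key>;IngestionEndpoint=https://<region>.in.applicationinsights.azure.com/"

# Confirm which apps the agent is now monitoring.
Get-ApplicationInsightsMonitoringStatus
```

Run the same script on each member of the web farm and every node reports into the same Application Insights resource, which is what makes the farm-wide live metrics and failure views possible.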
Managing software across an enterprise can be a headache, especially when it comes to removing outdated applications. Recently, I needed to uninstall the PaperCut MF Client from multiple Windows PCs in my environment. The challenge? Ensuring a clean removal without user intervention and no leftover files.
Rather than relying on manual uninstallation, we used Microsoft Intune to deploy a PowerShell script that handles the removal automatically. This blog post details the full process, from script development to deployment and testing.
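As a sketch of the approach — the registry lookup is the standard uninstall-entry pattern, but the silent switch and the leftover folder path below are assumptions that vary by PaperCut client version:

```powershell
# Sketch: locate the PaperCut MF Client uninstall entry and remove it silently.
$keys = @(
    'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
    'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
)
$app = Get-ItemProperty -Path $keys -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -like 'PaperCut MF Client*' }

if ($app) {
    # The silent switch is an assumption; check your client's installer type.
    Start-Process -FilePath 'cmd.exe' `
        -ArgumentList "/c `"$($app.UninstallString)`" /SILENT" -Wait

    # Clean up leftovers (path shown for illustration; adjust to your install).
    Remove-Item "$env:ProgramFiles\PaperCut MF Client" -Recurse -Force -ErrorAction SilentlyContinue
}
```

Deployed through Intune as a platform script or Win32 app, this runs in the SYSTEM context with no user interaction, which satisfies both requirements from the brief.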
Using the method detailed in this post, I successfully passed the AZ-400 exam while creating a reusable study system. This approach helped me transform 34+ hours of MSLearn content into structured, searchable revision notes that I could quickly reference during my exam preparation.
Let me walk you through how I developed this system and how you can apply it to your own certification journey.