Automating Exchange Online using PowerShell and Github Actions with modern authentication


I know this is probably a niche use case, but being able to run PowerShell scripts to talk to Exchange is something that I spent some time devising and have gotten a lot of mileage out of the capability. So if you need to be able to run a PowerShell script that talks to Exchange in a Github workflow, here’s how you set up the access.

I’ve got examples of how I’ve used this at the bottom.


Setup Azure AD

I know certificates are usually complicated to deal with, but in this case it's our only option since basic auth is being retired. Fortunately, the setup process has been thoroughly documented by Microsoft, so I'll pause while you run through the five steps listed there. For reference, they are:

  1. Register the application in Azure AD
  2. Assign API permissions to the application
  3. Generate a self-signed certificate
  4. Attach the certificate to the Azure AD application
  5. Assign Azure AD Roles to the application
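
As a rough sketch of step 3, Microsoft's docs generate the certificate with New-SelfSignedCertificate. Something like this, where the subject name and validity period are placeholders you'd adjust:

```powershell
# Create a self-signed certificate in the current user's store
$cert = New-SelfSignedCertificate -Subject 'CN=ExO Automation' `
    -CertStoreLocation 'Cert:\CurrentUser\My' `
    -KeySpec KeyExchange -NotAfter (Get-Date).AddYears(1)

# Export the public key (this is what gets attached to the app in step 4)
Export-Certificate -Cert $cert -FilePath 'C:\tmp\cert.cer'

# Export the pfx with the private key, protected by a password (this is
# what the workflow will use)
$pw = Read-Host -AsSecureString -Prompt 'Pfx password'
Export-PfxCertificate -Cert $cert -FilePath 'C:\tmp\cert.pfx' -Password $pw
```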

Setup Github

Ok, so you should now have 3 things:

  • The Azure AD App ID (created in step 1)
  • The certificate in pfx format (created in step 3)
  • The secret to unlock the cert’s private key (also created in step 3, but hopefully using a unique password)

Prepare the Certificate

Github secrets can only hold strings, so we can’t store the binary certificate as a repository secret directly, and we definitely shouldn’t commit it to the repository. What we can do instead is convert the certificate to a byte array and serialize that as JSON, which we can store as a string in a Github secret.

Here’s what I mean:

$certPath = 'C:\tmp\cert.pfx'
# Get the content of the file as bytes (requires PowerShell 7+)
$byteCert = Get-Content $certPath -AsByteStream
# Convert the bytes to ints and serialize as json
$jsonCert = $byteCert | %{[int]$_} | ConvertTo-Json -Compress
# Copy the json string to your clipboard
$jsonCert | clip

Now you can store the certificate as a secret in your repo!

Save the repository secrets

In my example, I’m going to use the following secret names for my secrets:

  • JSON_CERT: My certificate in JSON format
  • CERT_SECRET: My certificate’s password
  • AAD_APP_ID: My Azure AD Application ID
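
If you prefer the command line to the web UI, the Github CLI can set these for you. A quick sketch, assuming `gh` is installed and authenticated, and that you've saved the JSON string from earlier to a file (the path and values below are placeholders):

```powershell
# gh secret set reads the value from stdin when no --body is given
Get-Content C:\tmp\cert.json -Raw | gh secret set JSON_CERT
gh secret set CERT_SECRET --body 'your-pfx-password'
gh secret set AAD_APP_ID --body '00000000-0000-0000-0000-000000000000'
```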

Set up the script

Now that we have all of our secrets in place, let’s get the script going so that we can authenticate to Exchange Online.

Calling the Script

I prefer to pass the secrets as parameters to my scripts, so I’ll start my script out with:

param (
    [string]$JSON_CERT,
    [string]$CERT_SECRET,
    [string]$AAD_APP_ID
)

And then call the script from my yaml with:

      - name: Run script
        shell: pwsh
        run: .\script.ps1 -JSON_CERT '${{ secrets.JSON_CERT }}' -CERT_SECRET '${{ secrets.CERT_SECRET }}' -AAD_APP_ID '${{ secrets.AAD_APP_ID }}'

Loading the Certificate

To recreate the certificate, we deserialize the JSON array back into bytes and write them to a local file. Since we have a byte array, [IO.File]::WriteAllBytes handles that nicely:

$certPath = "$PSScriptRoot\cert.pfx"
[IO.File]::WriteAllBytes($certPath,($JSON_CERT | ConvertFrom-Json | %{[byte]$_}))

Authenticate to Exchange

Now for the good part, we’ll use our secrets to authenticate to Exchange!

Install-Module ExchangeOnlineManagement -Confirm:$false -Force

$exoSplat = @{
    CertificateFilePath = $certPath
    CertificatePassword = (ConvertTo-SecureString $CERT_SECRET -AsPlainText -Force)
    AppId               = $AAD_APP_ID
    Organization        = '<your tenant name, e.g. contoso.onmicrosoft.com>'
}
Connect-ExchangeOnline @exoSplat
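
Once the script is done, it's worth tearing things down cleanly and scrubbing the pfx off the runner. A minimal sketch of the pattern I'd wrap around the Exchange work:

```powershell
try {
    # ... your Exchange work goes here ...
}
finally {
    # Close the session and remove the temporary certificate file
    Disconnect-ExchangeOnline -Confirm:$false
    Remove-Item $certPath -ErrorAction SilentlyContinue
}
```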

Script away!

And now you should be off to the races with your script.

Here are some examples that I’ve previously set up as scripts in Github Actions:

Use an IaC approach to Dynamic Groups

Let’s say that you want a dynamic group for each department. You can maintain a list of departments in a JSON array and, whenever you push a change to that file, have a script in a workflow that loads that list and ensures a dynamic group exists with the proper name and filter for each department.
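
A sketch of that loop, assuming a departments.json file containing a JSON array of department names and a hypothetical "Dyn - <department>" naming convention:

```powershell
$departments = Get-Content "$PSScriptRoot\departments.json" -Raw | ConvertFrom-Json

foreach ($dept in $departments) {
    $name = "Dyn - $dept"
    # Create the dynamic group only if it doesn't already exist
    if (-not (Get-DynamicDistributionGroup -Identity $name -ErrorAction SilentlyContinue)) {
        New-DynamicDistributionGroup -Name $name `
            -RecipientFilter "Department -eq '$dept'"
    }
}
```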

Service account management

I worked for a company that maintained a unique service account per location for their MFPs to be able to scan to email. A handful of locations isn’t a big deal, but we were dealing with hundreds. So I wrote a script that would take a JSON array of site names and ensure that the service account existed, that it had an Exchange license assigned, and had SMTP auth enabled. The script would run any time a change was pushed to the JSON file on the main branch.
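
The mailbox half of that script looked roughly like this. This is a sketch assuming a sites.json array of site names and a hypothetical scan-<site>@contoso.com address scheme; the licensing piece would go through a separate API (e.g. Microsoft Graph) and is omitted here:

```powershell
$sites = Get-Content "$PSScriptRoot\sites.json" -Raw | ConvertFrom-Json

foreach ($site in $sites) {
    $upn = "scan-$site@contoso.com"
    if (Get-Mailbox -Identity $upn -ErrorAction SilentlyContinue) {
        # Enable SMTP AUTH for just this mailbox (it's typically disabled tenant-wide)
        Set-CASMailbox -Identity $upn -SmtpClientAuthenticationDisabled $false
    }
}
```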


Reporting

In a situation where you have a widely dispersed team, having a central place to run reporting scripts is great. I’ve set up a repository dedicated to just running Exchange reports and, depending on the need, you can post the results to a Teams channel, email them, or just upload them to OneDrive. Lots of good options.
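
As one hedged example, a mailbox size report is only a few lines once you're connected, with the CSV left for a later workflow step to upload or post wherever you need it:

```powershell
# Export basic mailbox statistics to a CSV for the workflow to publish
Get-Mailbox -ResultSize Unlimited |
    Get-MailboxStatistics |
    Select-Object DisplayName, ItemCount, TotalItemSize |
    Export-Csv "$PSScriptRoot\mailbox-report.csv" -NoTypeInformation
```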
