Reusable PowerShell Modules with Pester: A Complete Guide

Last update: 17/12/2025
Author: Isaac
  • Converting functions into PowerShell modules with a manifest allows them to be easily reused, versioned, and distributed.
  • Pester provides unit and integration testing to validate modules and scripts, preventing regressions in future changes.
  • Integrating PowerShell, Pester, and PowerShellGet into CI/CD pipelines automates testing, linting, and publishing to the PowerShell Gallery.


If you work with PowerShell daily, sooner or later you'll get tired of constantly copying and pasting the same functions. That's when it really makes sense to talk about reusable PowerShell modules and how to test them seriously with Pester, just as you would with any "serious" development project.

Although PowerShell was originally developed as an administrator tool, today it is a complete, cross-platform automation language with a powerful ecosystem: modules, advanced scripting, automated testing, CI/CD... and if you know how to package your functions into well-built modules covered by Pester tests, you can version them, share them, publish them to the PowerShell Gallery, and deploy with confidence without breaking a sweat every time you change a line.

What exactly is a PowerShell module and how does it differ from a script?


A PowerShell module is, simply put, a package of reusable commands: functions, cmdlets, aliases, variables, classes, or DSC resources. What makes a module special is that PowerShell can load and unload it, control what is exposed to the user, and assign it a version, author, dependencies, and so on.

From a file perspective, a typical module consists of at least a .psm1 file (the code) and, almost always, a .psd1 manifest with metadata. It can also include other scripts, resources, help files, data, etc. When you import it with Import-Module, or thanks to automatic loading, its commands become available as if they were native.

A PowerShell script, by contrast, is just a .ps1 file that runs "all at once". It can contain functions, but those functions normally live in the script scope: if you don't dot-source the file or move them into a module, they disappear as soon as execution ends.

The practical difference is significant: scripts are usually "tasks you launch", while modules are "reusable libraries" that you can consume from other scripts, from the console, or even from CI/CD pipelines without duplicating code.

Scripts vs modules: when to use each one


A script (.ps1) is ideal when you have to solve a specific task: a backup, a report, a scheduled cleanup... You run it, it does its job, and it ends. If you define functions within that script, those functions live in the script scope: when it finishes, the scope is discarded and your functions with it.

A script module (.psm1), by contrast, is designed to encapsulate functions that you'll reuse many times, on different machines, and even in different projects. Stored in a module, they are loaded into a module scope, and you can export only what you need as a public API.

Many admins start with a giant script that does everything and, over time, find that they constantly share pieces of that code. The natural next step is to extract those repeated functions into a module, cover them with Pester tests, and let that module be the base consumed by other smaller, cleaner scripts.

Furthermore, modules enable something key for teams: versioning, publishing, and distribution. You can install modules from the PowerShell Gallery, from private repositories, from local repositories, or even embed them in system images so they are available by default.

Dot-sourcing: the “quick trick” before moving to modules

Before your functions live in a module, you might have them in a simple .ps1 script that you want to reuse. In that case, the only decent way to have them available in your session is dot-sourcing.

When you run a script normally, such as .\MyScript.ps1, its content is processed within the script scope: if the file defines functions, they disappear when the script finishes. However, if you invoke it with a dot and a space in front (dot-sourcing), PowerShell runs it in the global scope and the functions stay loaded:

Examples of dot-sourcing to load functions into memory:

  • Load with a relative path: . .\Get-MrPSVersion.ps1
  • Load with a full path: . C:\Demo\Get-MrPSVersion.ps1
  • Load using a path variable: $Path = 'C:\'; . $Path\Get-MrPSVersion.ps1

After dot-sourcing, you can verify that the function exists in the Function: drive with Get-ChildItem -Path Function:\Get-MrPSVersion, since it is now part of the global scope.
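Putting it together, a quick session sketch (assuming Get-MrPSVersion.ps1 only defines the function) looks like this:

.\Get-MrPSVersion.ps1                          # normal run: the function dies with the script scope
. .\Get-MrPSVersion.ps1                        # dot-sourced: the function persists in the session
Get-ChildItem -Path Function:\Get-MrPSVersion  # confirms it now lives in the Function: drive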

The problem is that this approach scales poorly: you have to remember to dot-source every file, and you have no version, no manifest, and no distinction between public and private functions. It's a useful shortcut, but for anything serious and shareable, it's much better to make the leap to script modules.

Creation of script modules and automatic loading


In PowerShell, a script module is straightforward: it's a .psm1 file that contains your functions. If, for example, you create a file MyScriptModule.psm1 with something like:

function Get-MrPSVersion { $PSVersionTable }
function Get-MrComputerName { $env:COMPUTERNAME }

and then try to use Get-MrComputerName, you will see that it is not found until you import the module. You can do that manually with:


Import-Module C:\MyScriptModule.psm1

The real magic arrived with PowerShell 3: automatic module loading. For it to work, you have to:

  • Save your module in a folder whose base name matches the module, for example MyScriptModule\MyScriptModule.psm1.
  • Place that folder in one of the paths listed in $env:PSModulePath.

The value of $env:PSModulePath is a list of paths separated by ';'. You can view it legibly with:

$env:PSModulePath -split ';'

Typical paths include the user's modules folder, the global modules folder under Program Files, and the system one in System32. The usual practice is to install your own modules in the "AllUsers" location (Program Files) and leave the System32 one for system modules only.

Once the module is in place, PowerShell will load it automatically the first time you invoke one of its commands, without needing to call Import-Module manually, which is very convenient in scripts and interactive sessions.
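For example, a minimal sketch of setting this up on Windows PowerShell (using the AllUsers path from above):

# create the module folder, copy the .psm1 in, and let auto-loading do the rest
$dest = "$env:ProgramFiles\WindowsPowerShell\Modules\MyScriptModule"
New-Item -Path $dest -ItemType Directory -Force | Out-Null
Copy-Item -Path C:\MyScriptModule.psm1 -Destination $dest
Get-MrComputerName   # first call triggers the automatic import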

Module manifests: metadata, version, and function export

For a module to be properly defined, the .psm1 file alone is not enough: you need a .psd1 manifest, which is nothing more than a PowerShell data file with the module's metadata (name, version, author, description, dependencies, exported functions, etc.).

The standard way to create it is with New-ModuleManifest, passing at least the Path of the .psd1 and the RootModule parameter pointing to the .psm1. Also, if you plan to publish the module to a repository like the PowerShell Gallery, fields such as Author, Description, or CompanyName are required.

One clue that your module is missing a manifest is to check its version with Get-Module -Name MyScriptModule and see version 0.0, a clear symptom that there is only a loose .psm1 file without an associated .psd1.

A typical pattern for creating the manifest looks something like this:

$moduleManifestParams = @{
    Path        = "$env:ProgramFiles\WindowsPowerShell\Modules\MyScriptModule\MyScriptModule.psd1"
    RootModule  = 'MyScriptModule'
    Author      = 'Module Author'
    Description = 'Reusable administration functions'
    CompanyName = 'MyCompany'
}
New-ModuleManifest @moduleManifestParams

Later on, if you need to change any data, instead of recreating the manifest (which would generate a new GUID and can break dependencies), the correct approach is to use Update-ModuleManifest to modify only what is necessary.
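For example, a hedged sketch of bumping only the version (path as in the manifest above):

# updates ModuleVersion in place, preserving the existing GUID and other metadata
Update-ModuleManifest -Path "$env:ProgramFiles\WindowsPowerShell\Modules\MyScriptModule\MyScriptModule.psd1" -ModuleVersion '1.1.0'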

The manifest also includes key sections such as FunctionsToExport, where you list exactly which module functions are exposed to the user, leaving the rest as private functions internal to the module.

Public and private functions in your modules

In a real-world module, you don't want to expose everything. It's normal to have internal helper functions that are only called by other functions in the module and that you don't want users invoking directly. There are two main ways to control what goes "outside":

If you don't have a manifest (a simple module with only a .psm1), you can control visibility with Export-ModuleMember within the .psm1 file itself. For example:

function Get-MrPSVersion { $PSVersionTable }
function Get-MrComputerName { $env:COMPUTERNAME }
Export-ModuleMember -Function Get-MrPSVersion

In that case, only Get-MrPSVersion will be visible as a module command, while Get-MrComputerName will still be accessible from other functions of the module but will not be publicly exported. You can check which commands are exposed with:

Get-Command -Module MyScriptModule

If you have a manifest, the cleanest way is to define that list in the FunctionsToExport key of the .psd1, for example:

FunctionsToExport = 'Get-MrPSVersion'

It is not necessary to duplicate the logic by using both Export-ModuleMember in the .psm1 and FunctionsToExport in the .psd1 at the same time: either mechanism is sufficient. The manifest-based approach is usually clearer in the medium term.

PowerShell Gallery, requirements and publishing of modules and scripts

The PowerShell Gallery is the main public repository for modules and scripts. From there you install dependencies with Install-Module, update with Update-Module, and publish your own packages with Publish-Module or Publish-Script.

To publish a module there are a series of minimum requirements to keep in mind: a unique name, a semantic version, a correct manifest, the basic metadata filled in, and, if you use dependencies, that they are also available. Similarly, scripts you want to publish as such need a header with metadata in script file format (author, description, version, etc.).
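For scripts, PowerShellGet can generate and validate that header for you; a hedged sketch (file name and values are illustrative):

# creates MyScript.ps1 with a comment-based metadata header (use -Force to overwrite an existing file)
New-ScriptFileInfo -Path .\MyScript.ps1 -Version '1.0.0' -Author 'Isaac' -Description 'Example script for the Gallery'
Test-ScriptFileInfo -Path .\MyScript.ps1   # validates the header before Publish-Script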

The relationship between PowerShellGet, PackageManagement, and NuGet works roughly like this: PowerShellGet is the module you use directly (Install-Module, Publish-Module...), PackageManagement is the generic package management layer, and underneath it all, for the Gallery, NuGet acts as the provider. That means that, even though the backend is NuGet, using nuget.exe is not the same as using PowerShellGet; for most PowerShell module tasks you will always use the PowerShellGet cmdlets.

To publish you need a NuGet API key (NUGET_KEY) used as a credential. In CI/CD environments it is stored as a secret and exposed at build time so a Publish-Module -NuGetApiKey call can run. You can also claim ownership of a package, reserve names, or report license violations directly in the Gallery, following its support flows.


What is Pester and why should you use it with your modules?

Pester is the standard unit and integration testing framework for PowerShell. It is a module written in PowerShell that defines a very readable, block-based testing language with blocks such as Describe, Context, It, BeforeAll, and AfterAll, and assertions like Should -Be, Should -Not -Be, etc.

Pester's goal is that every time you change your code, you can run a set of automated tests that ensure you still meet the expected behavior. This avoids many common pitfalls: fixing one thing in production and breaking something else as a consequence, or having an authentication change in a large project wreck half the platform without anyone seeing it coming.

In practice, Pester lets you cover both unit tests (isolated functions, with mocks where necessary) and integration tests (your module talking to SharePoint, Azure, SQL, etc.). For many organizations, having Pester tests is a minimum quality standard, even for "simple" administrative scripts.

Beyond assertions, Pester offers logical grouping of tests, mocking of external commands, and code coverage measurement. This gives you a lot of visibility into which parts of the module are covered and which are not.

Installation and basic use of Pester with modules

Modern versions of Windows usually ship with an old version of Pester (v3) pre-installed. Nowadays the usual approach is to install the updated version from the Gallery (on Windows PowerShell 5.1 you may also need -SkipPublisherCheck, because the inbox version is signed by a different publisher) with:

Install-Module -Name Pester -Force

Pester works on Windows, Linux, and macOS and is compatible with PowerShell 3, 4, 5, 6, and 7, so you won't have problems in mixed environments. Once installed, all you have to do is create test files with the suffix .Tests.ps1 or .Test.ps1 that describe the behavior of your modules.

The typical structure of a test file includes blocks such as:

  • BeforeAll: runs once before all tests (ideal for importing the module or defining common data).
  • Describe: groups tests for a function or area of functionality.
  • Context: subdivides scenarios ("when the parameters are correct", "when there is a credential error"...).
  • It: each It block is an individual test with its set of assertions using Should.
  • AfterAll: cleans up after all tests (closes connections, deletes temporary files, etc.).

To launch the tests, simply call Invoke-Pester pointing at the test file or folder; you'll get a summary of the number of passed and failed assertions, along with the execution time and, if you want, detailed output with -Output Detailed.
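As a minimal sketch using the module from the earlier sections (the import assumes the module is reachable via $env:PSModulePath), a MyScriptModule.Tests.ps1 file could look like this:

BeforeAll {
    # import the module under test; adjust the name or path to your layout
    Import-Module MyScriptModule -Force
}

Describe 'Get-MrPSVersion' {
    It 'returns the PowerShell version table' {
        (Get-MrPSVersion).PSVersion | Should -Not -BeNullOrEmpty
    }
}

Running Invoke-Pester .\MyScriptModule.Tests.ps1 then prints the pass/fail summary described above.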

Practical example: testing a login function with Pester

Imagine a PowerShell function called LoginSP that encapsulates the login to SharePoint Online using PnP (Patterns and Practices) or CSOM and returns a SharePoint context. This function lives in your module, or in a .ps1 file that you use in many automation scripts.

If dozens of business functions depend on it, you can't afford an implementation change (for example, moving from PnP to CSOM) to break everything. The reasonable thing to do is to write unit tests with Pester in a file named LoginSharePoint.Test.ps1 or similar.

In that test file, the BeforeAll block usually dot-sources or imports the module containing the LoginSP function. Then you create one or more Describe "Testing LoginSP" blocks and, within them, contexts for different scenarios: correct parameters, incorrect parameters, wrong credentials, etc.

Each It block contains a specific test, for example: "should not return null" or "the URL of the returned context matches the input URL". In one test you might pipe the function call directly into Should; in another, store the result in a variable to inspect its properties and print additional information with Write-Host if necessary.

You run it all with Invoke-Pester ".\LoginSharePoint.Test.ps1" and, if you want more detail, you add -Output Detailed. In addition to the immediate output, Pester can be integrated with code coverage to see which parts of the module were executed during testing.
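Putting those pieces together, a hedged sketch of such a test file could look like this (LoginSP, its parameter names, and the Url property of the returned context are hypothetical, taken from the scenario above):

Describe 'Testing LoginSP' {
    BeforeAll {
        # dot-source the file that defines the hypothetical LoginSP function
        . .\LoginSP.ps1
    }
    Context 'When the parameters are correct' {
        It 'should not return null' {
            LoginSP -SiteCollUrl 'https://tenant.sharepoint.com/sites/TeamSite' -UserAccount 'user@tenant.onmicrosoft.com' -UserPW 'pw' |
                Should -Not -BeNullOrEmpty
        }
        It 'returns a context whose Url matches the input URL' {
            $ctx = LoginSP -SiteCollUrl 'https://tenant.sharepoint.com/sites/TeamSite' -UserAccount 'user@tenant.onmicrosoft.com' -UserPW 'pw'
            $ctx.Url | Should -Be 'https://tenant.sharepoint.com/sites/TeamSite'
        }
    }
}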

Input parameters in Pester tests and container usage

Hardcoding credentials or URLs into every test, as in the sketch above, is a maintenance nightmare and hard to automate. The elegant approach with Pester is to declare parameters in the test file itself, exactly as you would in a PowerShell script:

Param (
    [Parameter(Mandatory = $true)] [string]$UserAccount,
    [Parameter(Mandatory = $true)] [string]$UserPW,
    [Parameter(Mandatory = $true)] [string]$SiteCollUrl
)

With this, you can use $UserAccount, $UserPW, and $SiteCollUrl across all the tests without repeating them. To pass those values in when you run the tests, Pester provides containers created with New-PesterContainer:

$myContainer = New-PesterContainer -Path 'LoginSharePoint.Test.ps1' -Data @{
    UserAccount = 'user@tenant.onmicrosoft.com'
    UserPW      = 'MyVerySecurePW'
    SiteCollUrl = 'https://tenant.sharepoint.com/sites/TeamSite'
}
Invoke-Pester -Container $myContainer -Output Detailed

This way you can inject different test data without touching the test code, which is perfect for CI pipelines where credentials, URLs, or tenants change per environment (development, pre-production) and are supplied as secrets or protected variables.


Unit and integration testing: what each covers

In Pester's world it is important to distinguish between unit tests and integration tests. Unit tests focus on small blocks of code in isolation, usually mocking external dependencies. For example, a "New-Drawer" function should return a drawer object with four corners, two rails, a handle, the correct color, and the correct dimensions.
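To make that isolation concrete, here is a minimal hedged sketch, assuming a hypothetical Carpentry module whose New-Drawer function internally calls a Get-WoodStock command (all names are illustrative):

Describe 'New-Drawer' {
    BeforeAll {
        # import the hypothetical module under test, which defines both functions
        Import-Module .\Carpentry.psm1 -Force
    }
    It 'returns a drawer with four corners without touching the real stock system' {
        # replace the module's external dependency with a canned answer
        Mock Get-WoodStock { @{ Color = 'Oak' } } -ModuleName Carpentry
        $drawer = New-Drawer -Color 'Oak'
        $drawer.Corners | Should -Be 4
        $drawer.Color | Should -Be 'Oak'
    }
}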

Integration tests, for their part, verify that all of these individual pieces work correctly when connected to real systems: SharePoint, databases, external services, etc. You can have all the unit tests green and still have the integration be a disaster if you don't test it explicitly (for example, if the cabinet the "drawer" is installed in doesn't fit it).

In PowerShell, your modules typically encapsulate all that business logic, and Pester helps you ensure both that each function does what it should (unit) and that the combination of functions, modules, and external services works together (integration).

The great advantage is that, once the test suite is in place, every code change becomes a simple matter of running Pester and reviewing the results, which protects you against regressions in large, live projects.

PowerShell, modules, and Pester in CI/CD pipelines

Today it is very common to integrate PowerShell and Pester into continuous integration and delivery pipelines, for example with GitHub Actions. The GitHub-hosted runners already come with PowerShell 7 and Pester pre-installed, plus the option to install additional modules from the Gallery with Install-Module.

A typical workflow starts with a test job that runs Pester on your module on every push. A simplified example in YAML might include steps such as the following (a PowerShell sketch of the test step comes right after the list):

  • Repository checkout (actions/checkout@v5).
  • Running a simple test, such as Test-Path resultsfile.log | Should -Be $true, to verify that a results file is generated.
  • Running a complete test file with Invoke-Pester Unit.Tests.ps1 -Passthru.
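The Pester part of that job boils down to a few lines of PowerShell; a hedged sketch of the test step (file names are illustrative):

# install the current Pester and run the tests, failing the job on any failure
Install-Module -Name Pester -Force
$result = Invoke-Pester -Path ./Unit.Tests.ps1 -PassThru
if ($result.FailedCount -gt 0) {
    throw "$($result.FailedCount) Pester test(s) failed."
}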

In addition to running Pester, it is common to install dependencies from the Gallery (for example, SqlServer or PSScriptAnalyzer), configure the PSGallery repository as trusted, and cache the modules to speed up downloads using actions/cache. On Linux, for example, the typical module cache lives in ~/.local/share/powershell/Modules, while on Windows the path changes.
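A hedged sketch of that dependency step, using the module names mentioned above:

# trust the Gallery so Install-Module runs non-interactively, then pull dependencies
Set-PSRepository -Name PSGallery -InstallationPolicy Trusted
Install-Module -Name SqlServer, PSScriptAnalyzer -Force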

Another very useful pattern is to run PSScriptAnalyzer in the pipeline to lint all .ps1 files. This produces a list of issues by severity and fails the job if there are serious errors, which forces a consistent coding style with no obvious mistakes even before you get to functional testing with Pester.
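A minimal sketch of such a lint step (linting from the repository root is an assumption):

# lint every script in the repo and fail the job on Error-severity findings
$issues = Invoke-ScriptAnalyzer -Path . -Recurse
$issues | ForEach-Object { "$($_.Severity) $($_.ScriptName):$($_.Line) $($_.RuleName)" }
$errors = @($issues | Where-Object Severity -eq 'Error')
if ($errors.Count -gt 0) {
    throw "$($errors.Count) lint error(s) found."
}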

Publishing modules to PowerShell Gallery from CI

Once your Pester tests pass in CI, the logical next step is to automate publishing your module to the Gallery when you create a new release. In GitHub Actions, this is typically done with a workflow triggered by events of type release: created.

The corresponding job usually:

  • Checks out the code.
  • Runs a build script (for example ./build.ps1 -Path /tmp/samplemodule) that generates the final structure of the module, with the .psm1, the .psd1, and any additional resources.
  • Reads the NUGET_KEY secret from the repository's secrets and exposes it as an environment variable.
  • Calls Publish-Module -Path /tmp/samplemodule -NuGetApiKey $env:NUGET_KEY -Verbose to upload the new version to the Gallery (condensed in the sketch below).
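Condensed into PowerShell, the publish step is essentially (assuming NUGET_KEY has already been exposed as an environment variable by the workflow):

# build the module layout and push it to the Gallery with the injected API key
./build.ps1 -Path /tmp/samplemodule
Publish-Module -Path /tmp/samplemodule -NuGetApiKey $env:NUGET_KEY -Verbose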

In this flow, the publishing secrets are never written in plain text in scripts: they are managed solely by the CI provider, and the module is only published if Pester's tests have passed first. It's the cleanest way to avoid breaking downstream users with "broken" versions.

As a finishing touch, a step is usually added to upload the test artifacts produced by Invoke-Pester (for example, an XML or CliXml file with results) using the upload-artifact action. This lets you review what went wrong even if the job is marked as failed, thanks to conditions such as if: ${{ always() }} that guarantee the upload even when Pester returns errors.
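With Pester 5, producing that results file is a configuration option; a hedged sketch (the tests folder path is illustrative):

# write an NUnit-style XML report that the upload-artifact step can collect
$config = New-PesterConfiguration
$config.Run.Path = './tests'
$config.TestResult.Enabled = $true
$config.TestResult.OutputPath = 'Unit.Tests.xml'
Invoke-Pester -Configuration $config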

In the end, combining well-packaged PowerShell modules, comprehensive testing with Pester, and CI/CD pipelines that automatically test, analyze, and publish lets you move from loose, fragile scripts to an ecosystem of reusable, versioned, reliable tools, making day-to-day work much more comfortable and predictable.