Find All Legacy Excel Files on Your Network with PowerShell

March 19, 2026 · 8 min read · Excel Migration

The hardest part of any Windows 10 EOL migration isn't the actual conversion — it's the discovery. IT teams consistently underestimate how many legacy .xls, .xlsm, and .mdb files are hiding across their file servers, shared drives, and user home directories.

This guide gives you a ready-to-run PowerShell script that finds every legacy Office file on your network and exports a CSV you can use as your migration inventory. No third-party tools, no licenses, no setup beyond standard Windows PowerShell.

The Quick Version

If you're in a hurry, here's the one-liner. Run it in an elevated PowerShell session, replacing \\fileserver\share with your actual path:

Get-ChildItem -Path "\\fileserver\share" -Recurse -Include "*.xls","*.xlsm","*.xlsb","*.mdb","*.mde","*.accde" -ErrorAction SilentlyContinue |
  Select-Object FullName, Length, LastWriteTime, @{N="Owner";E={(Get-Acl $_.FullName).Owner}} |
  Export-Csv -Path "$env:USERPROFILE\Desktop\legacy-files-inventory.csv" -NoTypeInformation

This will output a CSV to your desktop with every legacy file it finds, along with file size, last modified date, and owner. For most file servers with under 500,000 files, this runs in under 10 minutes.

Tip: The -ErrorAction SilentlyContinue flag in the one-liner skips permission-denied paths without stopping the scan. Check the CSV for gaps afterward and re-run specific directories from an account with broader permissions if needed.

The Full Script (Production-Ready)

The one-liner above is fine for a quick scan. For production use — especially if you're scanning multiple servers or need to handle errors gracefully — use the full version:

# Legacy Office File Discovery Script
# Run in elevated PowerShell session
# Usage: .\Find-LegacyOfficeFiles.ps1 -Paths "\\server1\share","\\server2\data" -OutputPath "C:\Reports\legacy-inventory.csv"

param(
    [string[]]$Paths = @("C:\Users"),
    [string]$OutputPath = "$env:USERPROFILE\Desktop\legacy-files-$(Get-Date -Format 'yyyy-MM-dd').csv",
    [switch]$IncludeOwner  # Slower — requires an ACL lookup per file
)

$LegacyExtensions = @("*.xls", "*.xlsm", "*.xlsb", "*.xla", "*.xlam", "*.mdb", "*.mde", "*.accde", "*.accdb")
$Results = [System.Collections.Generic.List[PSObject]]::new()
$ErrorLog = [System.Collections.Generic.List[string]]::new()

Write-Host "Scanning $($Paths.Count) path(s) for legacy Office files..." -ForegroundColor Cyan

foreach ($Path in $Paths) {
    Write-Host "  Scanning: $Path"
    try {
        $Files = Get-ChildItem -Path $Path -Recurse -Include $LegacyExtensions -ErrorAction SilentlyContinue -ErrorVariable ScanErrors
        foreach ($File in $Files) {
            $Entry = [PSCustomObject]@{
                FullPath     = $File.FullName
                FileName     = $File.Name
                Extension    = $File.Extension.ToLower()
                SizeKB       = [math]::Round($File.Length / 1KB, 1)
                LastModified = $File.LastWriteTime.ToString("yyyy-MM-dd")
                LastAccessed = $File.LastAccessTime.ToString("yyyy-MM-dd")
                Directory    = $File.DirectoryName
                Owner        = if ($IncludeOwner) { try { (Get-Acl $File.FullName).Owner } catch { "N/A" } } else { "" }
            }
            $Results.Add($Entry)
        }
        foreach ($Err in $ScanErrors) {
            $ErrorLog.Add("ACCESS DENIED: $($Err.TargetObject)")
        }
    } catch {
        $ErrorLog.Add("SCAN FAILED: $Path — $($_.Exception.Message)")
    }
}

# Export results
$Results | Sort-Object LastModified -Descending | Export-Csv -Path $OutputPath -NoTypeInformation
Write-Host ""
Write-Host "Done. Found $($Results.Count) legacy files." -ForegroundColor Green
Write-Host "Results saved to: $OutputPath" -ForegroundColor Green

if ($ErrorLog.Count -gt 0) {
    $ErrorPath = $OutputPath -replace "\.csv$", "-errors.txt"
    $ErrorLog | Out-File -FilePath $ErrorPath
    Write-Host "$($ErrorLog.Count) access errors logged to: $ErrorPath" -ForegroundColor Yellow
}

# Summary by extension
Write-Host ""
Write-Host "Summary by file type:" -ForegroundColor Cyan
$Results | Group-Object Extension | Sort-Object Count -Descending | ForEach-Object {
    Write-Host "  $($_.Name.PadRight(10)) $($_.Count) files"
}

How to Run It

  1. Save the script as Find-LegacyOfficeFiles.ps1
  2. Open PowerShell as Administrator
  3. If needed, allow script execution: Set-ExecutionPolicy RemoteSigned -Scope Process
  4. Run: .\Find-LegacyOfficeFiles.ps1 -Paths "\\fileserver\share","\\nas\data" -OutputPath "C:\Reports\inventory.csv"
  5. Add -IncludeOwner if you need file owners (slower — adds an ACL lookup per file)

Free scan — see what's inside your files

Once you have your inventory CSV, use LegacyLeaps's free scan to inspect your highest-priority files. You'll see every formula, VBA module, ActiveX control, and macro — before you commit to migration.

Try the Free Scan

What to Do with the Inventory CSV

Open the CSV in Excel. Here's how to work through it:

Step 1: Sort by LastModified (descending)

Files modified in the last 12 months are almost certainly still in active use. These are your Tier 1 migration priorities. Files last modified 3+ years ago are candidates for archival.
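This triage is easy to automate against the inventory CSV. A minimal sketch, assuming the column names produced by the full script above (LastModified in yyyy-MM-dd format) and example report paths:

```powershell
# Load the inventory CSV produced by the discovery script (path is an example)
$Inventory = Import-Csv "C:\Reports\inventory.csv"

# Tier 1: modified in the last 12 months, almost certainly still in active use
$Tier1 = $Inventory | Where-Object { [datetime]$_.LastModified -ge (Get-Date).AddMonths(-12) }

# Archive candidates: untouched for 3+ years
$Archive = $Inventory | Where-Object { [datetime]$_.LastModified -lt (Get-Date).AddYears(-3) }

$Tier1 | Export-Csv "C:\Reports\tier1-priorities.csv" -NoTypeInformation
```

The ISO-style yyyy-MM-dd dates the script writes parse cleanly with a [datetime] cast regardless of regional settings, which is why the script stores them that way.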

Step 2: Filter by extension

Extension | What it is                               | Migration urgency
.xls      | Legacy Excel workbook (Excel 97–2003)    | High — format limit is 65,536 rows
.xlsm     | Excel with macros (Open XML format)      | Medium — format is modern, but validate macros
.xlsb     | Excel binary workbook                    | Medium — consider converting to .xlsx for compatibility
.mdb      | Legacy Access database (Jet engine)      | Critical — 32-bit Jet only; incompatible with 64-bit Office
.mde      | Compiled Access database                 | Critical — cannot be decompiled, source needed
.accde    | Compiled Access database (newer format)  | High — cannot be decompiled without source

Step 3: Flag the .mde files immediately

.mde files are compiled Access databases. They cannot be decompiled — you need the original .mdb source to migrate them. If you find .mde files in your inventory and don't have the source, escalate to find the original developer or source file before doing anything else. This is the scenario that causes the most post-EOL emergencies.
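Pulling these out of the inventory takes one filter. A sketch, assuming the Extension column written by the full script (lowercased, with the leading dot) and an example report path:

```powershell
# Flag every compiled Access file for source-code escalation
$Inventory = Import-Csv "C:\Reports\inventory.csv"

$Compiled = @($Inventory | Where-Object { $_.Extension -in @(".mde", ".accde") })
$Compiled | Export-Csv "C:\Reports\compiled-access-escalations.csv" -NoTypeInformation

Write-Host "Found $($Compiled.Count) compiled Access files needing source escalation"
```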

Step 4: Identify large files for done-for-you service

Sort by SizeKB descending. Access databases over 50MB, or Excel files over 10MB with macros, are candidates for LegacyLeaps's done-for-you service — large files tend to have more complex logic that benefits from human validation.
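The size thresholds above can be applied directly to the CSV. A sketch, assuming the SizeKB and Extension columns from the full script and example paths (note the SizeKB values are strings after Import-Csv, so they need a numeric cast before comparing or sorting):

```powershell
$Inventory = Import-Csv "C:\Reports\inventory.csv"

# Access databases over 50 MB, or macro-enabled Excel files over 10 MB
$Large = $Inventory | Where-Object {
    ($_.Extension -in @(".mdb", ".accdb") -and [double]$_.SizeKB -gt 50MB / 1KB) -or
    ($_.Extension -eq ".xlsm" -and [double]$_.SizeKB -gt 10MB / 1KB)
}

$Large | Sort-Object { [double]$_.SizeKB } -Descending |
  Export-Csv "C:\Reports\large-complex-candidates.csv" -NoTypeInformation
```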

Scanning SharePoint and OneDrive

The PowerShell script above covers SMB file shares. For Microsoft 365 environments:

SharePoint Online

Use the SharePoint PnP PowerShell module:

# Requires PnP.PowerShell module: Install-Module PnP.PowerShell
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/yoursite" -Interactive

$Files = Get-PnPListItem -List "Documents" -PageSize 500 -Fields "FileLeafRef","FileRef","File_x0020_Size","Modified" |
  Where-Object { $_.FieldValues.FileLeafRef -match "\.(xls|xlsm|xlsb|xla|xlam|mdb|mde|accde)$" }

$Files | Select-Object @{N="Name";E={$_.FieldValues.FileLeafRef}},
                       @{N="Path";E={$_.FieldValues.FileRef}},
                       @{N="SizeKB";E={[math]::Round($_.FieldValues.File_x0020_Size/1KB,1)}},
                       @{N="Modified";E={$_.FieldValues.Modified}} |
  Export-Csv -Path "$env:USERPROFILE\Desktop\sharepoint-legacy-files.csv" -NoTypeInformation

Personal OneDrive

For personal OneDrive libraries, have users sync their OneDrive locally (if not already) and run the file system scan against their local sync folder. The synced path is typically C:\Users\[username]\OneDrive - [Company Name]\.
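If you want to sweep every synced OneDrive folder on a shared machine in one pass, you can discover the sync roots and feed them to the discovery script. A sketch — the "OneDrive - *" folder-name pattern is the common default but varies by tenant, so treat it as an assumption:

```powershell
# Find each user's local OneDrive sync folder (folder name pattern is an assumption)
$SyncRoots = Get-ChildItem "C:\Users" -Directory -ErrorAction SilentlyContinue |
    ForEach-Object { Get-ChildItem $_.FullName -Directory -Filter "OneDrive - *" -ErrorAction SilentlyContinue } |
    Select-Object -ExpandProperty FullName

# Reuse the discovery script from earlier against those roots
.\Find-LegacyOfficeFiles.ps1 -Paths $SyncRoots -OutputPath "C:\Reports\onedrive-inventory.csv"
```

Note that OneDrive Files On-Demand placeholders may be hydrated (downloaded) when scanned, so run this with the users' awareness on metered connections.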

Handling the Results

Once you have a complete inventory, the next step is triage. Here's a practical tier system:

  Tier 1: modified in the last 12 months. Actively used; migrate first.
  Tier 2: modified 1–3 years ago. Verify with the file owner, then migrate or archive.
  Tier 3: not modified in 3+ years. Candidates for archival rather than migration.

Pro tip for MSPs: Package this script as a client deliverable. Run it, clean up the CSV, add a "Priority" column with your Tier 1/2/3 assessment, and present it in your Q2 EOL planning review. It positions you as proactive and creates a natural conversation about migration services.

After the Inventory: What Comes Next

The inventory tells you what you have. LegacyLeaps's free scan tells you what's inside each file — which is what determines migration complexity. A 50KB .xls with 12 VBA modules and 3 ActiveX controls is a completely different animal than a 50KB .xls with a simple data table.

The typical next step is to take your Tier 1 files, run them through LegacyLeaps's free scan, review the results, and use that to scope your migration project accurately. No surprises, no guessing.

For the full month-by-month migration plan, see our companion guide: Windows 10 EOL in 6 Months: Your Legacy File Migration Roadmap.

Frequently Asked Questions

Can I run this without admin rights?

You can run it with standard user rights on paths you have read access to. To scan all users' home directories and system-level paths, you'll need local admin or domain admin rights. For network shares, you need at least read permission on each share.

Will this scan SharePoint or OneDrive?

This script scans local paths and SMB network shares. For SharePoint Online and OneDrive, use the PnP PowerShell approach shown above or the Microsoft 365 Compliance Center's Content Search feature.

How long will it take on a large network?

A typical file server with 100,000 files takes 5–20 minutes. Very large shares (1M+ files) can take 1–2 hours. Consider breaking the scan into top-level subdirectories and running them in parallel PowerShell sessions.
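One way to parallelize is with background jobs, one per top-level subdirectory. A minimal sketch (the share path is an example; for simplicity this inlines a reduced scan rather than calling the full script):

```powershell
# Scan each top-level subdirectory of the share in its own background job
$Root = "\\fileserver\share"
$Jobs = Get-ChildItem $Root -Directory | ForEach-Object {
    Start-Job -ArgumentList $_.FullName -ScriptBlock {
        param($Dir)
        Get-ChildItem -Path $Dir -Recurse -Include "*.xls","*.xlsm","*.xlsb","*.mdb","*.mde","*.accde" -ErrorAction SilentlyContinue |
            Select-Object FullName, Length, LastWriteTime
    }
}

# Wait for all jobs to finish, then merge and export the combined results
$Jobs | Wait-Job | Receive-Job |
    Export-Csv "$env:USERPROFILE\Desktop\parallel-inventory.csv" -NoTypeInformation
```

Keep the job count modest (one per top-level folder, not one per subtree) — each job is a separate PowerShell process, and too many concurrent scans can saturate the file server rather than speed things up.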

Ready to migrate your Tier 1 files?

LegacyLeaps preserves your formulas, macros, and formatting. 100% money-back guarantee if it doesn't. Start with the free scan — no account required.

Download Free Scanner
