Your organization is finally getting everyone onto Windows 11. Or you've got a new IT mandate to get off legacy Office formats. Either way, you're staring at a file server with hundreds — maybe thousands — of .xls and .mdb files, and you need a plan.
The first thing most IT teams try: a cloud converter. Google "bulk xls to xlsx converter," find CloudConvert or Zamzar, upload a few files, and think the problem is solved.
It isn't. Here's why — and what to do instead.
Silent macro deletion is the biggest issue, and the one most often discovered after the damage is done.
CloudConvert and Zamzar convert .xls to .xlsx by extracting the data and rebuilding the file in the Open XML format. VBA code is not part of the Open XML data model — it lives in a separate binary section of the .xls file that cloud converters either can't read or deliberately ignore.
The result: every .xls file that contained VBA macros produces an .xlsx output with the macros silently deleted. No warning. No list of affected files. No way to know which files had macros unless you open every one individually.
For an organization with 500 .xls files, some fraction will have macros — and that fraction may include the most critical files. The monthly reports with embedded data refresh logic. The pricing tool that sales uses every day. The HR tracker with the calculation formulas that feed into payroll.
The correct output format for files with VBA is .xlsm, not .xlsx. Cloud converters don't produce .xlsm. A local tool that understands macro content can save each file to the right format automatically.
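You don't have to open every file individually to find the macros. As a minimal sketch using only the Python standard library (function names here are illustrative, not part of any product): legacy .xls files are OLE compound documents, and when one embeds a VBA project the storage name `_VBA_PROJECT` appears, encoded as UTF-16, in the container's directory entries. A byte search for that marker is a workable heuristic for triage; a full OLE parser such as the `olefile` package is more robust for production use.

```python
from pathlib import Path

OLE_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"  # OLE compound-file signature
# Directory-entry names are stored as UTF-16LE inside the container.
VBA_MARKER = "_VBA_PROJECT".encode("utf-16-le")

def has_vba_project(path: Path) -> bool:
    """Heuristic check: an OLE container whose directory mentions a VBA project."""
    data = path.read_bytes()
    return data.startswith(OLE_MAGIC) and VBA_MARKER in data

def files_needing_xlsm(root: str) -> list[Path]:
    """Legacy workbooks that must be converted to .xlsm, not .xlsx."""
    return sorted(p for p in Path(root).rglob("*.xls") if has_vba_project(p))
```

Run this against the file share before converting anything, and you have the list of files where a macro-stripping converter would do silent damage.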
The second problem is raw file size. Zamzar's Business plan caps individual file uploads at 1GB; CloudConvert's limits vary by plan but are similarly constrained.
Financial model workbooks, database export files, and engineering data files routinely exceed these limits. A 2GB Excel file that's been in production for ten years isn't unusual. You can't upload it to a cloud converter. You're stuck.
A local tool processes files entirely in memory on the converting machine. There is no inherent file size limit beyond the machine's available RAM and disk space. A 3GB .xls file processes the same way as a 30KB one.
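A pre-flight check makes the size problem concrete before anyone wastes time uploading. A short sketch (hypothetical names; the 1GB default mirrors the upload caps mentioned above) that lists the legacy files a capped cloud service would reject outright:

```python
from pathlib import Path

UPLOAD_CAP = 1 * 1024**3  # 1 GB, matching typical per-file cloud caps

def rejected_by_cloud(root: str, cap: int = UPLOAD_CAP) -> list[tuple[Path, int]]:
    """Legacy files a capped upload service would refuse, biggest first."""
    hits = [
        (p, p.stat().st_size)
        for p in Path(root).rglob("*")
        if p.suffix.lower() in {".xls", ".mdb"} and p.stat().st_size > cap
    ]
    return sorted(hits, key=lambda t: t[1], reverse=True)
```

If this returns anything at all, a pure cloud-converter workflow is off the table for at least part of the batch.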
Data privacy is the issue that stops most enterprise bulk migrations from using cloud tools entirely, and rightly so.
When you upload a file to CloudConvert or Zamzar, that file travels to a server you don't control, often in another country, is processed on shared infrastructure, and is temporarily stored until their retention policy deletes it. Their privacy policies state they delete files after conversion, but "deleted" in cloud infrastructure often means "removed from the index" rather than "securely wiped from all storage layers."
For organizations under GDPR, HIPAA, SOC 2, PCI-DSS, or any internal data classification policy, bulk upload to a cloud converter is not an option for any file that contains:

- personal data (names, addresses, national IDs)
- financial records or payment card data
- health information
- anything your classification policy marks internal or confidential
That's most business files. A local tool processes everything on your hardware, inside your network perimeter, with no data leaving the machine. Your security team signs off on a local app in a way they will never sign off on a cloud upload service.
No upload. No cloud processing. Your files never leave your computer — or your network. That's not a feature we added; it's how the product was designed from day one.
Read our security page.

Cloud converters charge by conversion or by subscription tier. At 500+ files, the math turns against you quickly.
| Scenario | Cloud Converter (est.) | LegacyLeaps Local |
|---|---|---|
| 100 files, no macros | ~$15–40 | Token bundle (~$10–20) |
| 500 files, mixed macros | ~$75–200 + manual macro fixes | Token bundle + automated fixes |
| 2,000 files, enterprise batch | $300–800+ (and still no macro support) | Done-for-you flat rate |
| File with macros converted to .xlsx | Macros deleted (hidden cost) | Preserved, saved as .xlsm |
| Files > 1GB | Rejected (can't convert) | Processed normally |
The cloud converter price looks lower at first. Add in the cost of discovering that macros were stripped, the IT hours spent identifying which files were affected, and the time to manually recover or rebuild the VBA — and the "cheap" cloud option ends up being the most expensive option on the board.
When your next audit comes around and the auditor asks "how were these files migrated and what was verified," "we uploaded them to a website" is not the answer you want to give.
A local migration tool can generate a conversion report for every file: original format, target format, macro count, formulas preserved, conversion status, and any issues flagged for review. That's an audit trail. A cloud service transaction log is not.
For organizations in regulated industries, documented migration procedures with per-file verification logs aren't optional — they're the difference between a clean audit and a finding.
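The per-file report described above is straightforward to produce locally. A minimal sketch (the field names are illustrative, not LegacyLeaps's actual report schema): one CSV row per converted file, covering the columns an auditor will ask about.

```python
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class ConversionRecord:
    source: str          # original file path
    source_format: str   # e.g. "xls"
    target_format: str   # "xlsx", or "xlsm" when macros were found
    macro_count: int     # VBA modules detected (0 means .xlsx is safe)
    status: str          # "converted" or "flagged for review"
    issues: str          # free-text notes for the reviewer

def write_report(records: list[ConversionRecord], out_path: str) -> None:
    """One row per file: the audit trail a cloud transaction log can't give."""
    with open(out_path, "w", newline="") as fh:
        writer = csv.DictWriter(
            fh, fieldnames=[f.name for f in fields(ConversionRecord)]
        )
        writer.writeheader()
        writer.writerows(asdict(r) for r in records)
```

Keep the resulting CSV alongside the migrated files and the "how were these files migrated" question answers itself.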
Here's a practical workflow for IT teams or MSPs handling a bulk legacy file migration:

1. Inventory the file share: list every .xls and .mdb file with its size and owner.
2. Flag files that contain VBA macros and files too large for any per-file cap.
3. Convert locally, saving macro-containing workbooks as .xlsm and the rest as .xlsx.
4. Verify each output and record format, macro count, and status in a per-file report.
5. Keep the report with the originals as the migration's audit trail.
This is what LegacyLeaps's IT Teams and MSP tiers support. One tool, one workflow, one report — for the whole batch.
To be balanced: cloud converters are fine for specific use cases.
If you have a handful of simple .xls files with no macros, no PII, and no size constraints — a cloud converter gets the job done in two minutes. No installation, no setup. That's a legitimate convenience trade-off for a personal or low-stakes use case.
The problem isn't that cloud converters exist. The problem is using a consumer convenience tool to handle an enterprise data migration project — because the failure modes aren't visible until after the damage is done.
LegacyLeaps's IT Teams and MSP tiers are built for this. Batch processing, conversion reports, VBA preservation, local-only processing. Talk to us about your file inventory and we'll scope the project.
See IT Teams Pricing.