You are importing a database export into Excel. The file has 2 million rows. Excel opens it, and everything looks fine — until you scroll to the bottom and realize you are looking at row 1,048,576. Half your data is gone. Excel did not warn you. It did not throw an error. It just quietly dropped a million rows.
This is the Excel row limit, and it has been tripping people up since 2007. Here is everything you need to know about it — what the limit actually is, why it exists, what happens when you hit it, and how to work around it.
The numbers: Excel's row and column limits
Excel has two sets of limits depending on the file format:
| Format | Max Rows | Max Columns | Excel Versions |
|---|---|---|---|
| XLSX (modern) | 1,048,576 | 16,384 (XFD) | Excel 2007 and later |
| XLS (legacy) | 65,536 | 256 (IV) | Excel 97-2003 |
| XLSB (binary) | 1,048,576 | 16,384 (XFD) | Excel 2007 and later |
The number 1,048,576 is 2²⁰. The column limit of 16,384 is 2¹⁴. These are not arbitrary numbers: they are powers of two chosen for efficient memory addressing in the underlying data structures.
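The arithmetic is easy to verify, including the letter labels of the last columns. A quick sketch (the conversion function is illustrative, not part of any Excel API):

```python
def col_letter(n):
    """Convert a 1-based column index to an Excel-style column label."""
    label = ""
    while n:
        n, r = divmod(n - 1, 26)          # base-26, but with no zero digit
        label = chr(ord("A") + r) + label
    return label

assert 2 ** 20 == 1_048_576          # XLSX max rows
assert 2 ** 16 == 65_536             # XLS max rows
assert col_letter(2 ** 14) == "XFD"  # XLSX max column (16,384)
assert col_letter(256) == "IV"       # XLS max column
```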
These limits are per worksheet, not per workbook. You can have multiple worksheets in a single workbook, each with up to 1,048,576 rows. However, the practical limit is memory: Excel loads all sheets into RAM, so a workbook with ten sheets of a million rows each would require enormous amounts of memory.
Why Microsoft chose this limit
The original Excel (1985) had a limit of 16,384 rows (2¹⁴). Excel 97 quadrupled it to 65,536 (2¹⁶), and Excel 2007 multiplied it by 16 to reach 1,048,576 (2²⁰).
Microsoft has not raised the limit since 2007 — nearly 20 years. There are practical reasons for this:
- Memory management. Excel loads the entire worksheet into memory. With 1 million rows and 50 columns of data, a single sheet can consume several gigabytes of RAM. Raising the limit to 10 million rows would make Excel unusable on most hardware.
- File format constraints. The XLSX format (which is a ZIP archive of XML files) bakes the grid dimensions into the Office Open XML specification. Raising the maximum would produce files that older Excel versions and third-party readers could not open correctly.
- Performance expectations. Users expect Excel to be responsive — scrolling, filtering, and recalculating should feel instant. Microsoft determined that 1 million rows is roughly the upper bound for maintaining that experience on typical hardware.
In short, the limit is not about what Excel could handle in theory. It is about what it can handle while remaining usable.
What happens when you exceed the limit
The consequences of hitting the row limit depend on how you are getting data into Excel:
Opening a CSV with more than 1M rows
This is the most dangerous scenario. When you open a CSV file in Excel that has more than 1,048,576 rows, Excel silently truncates the file. It loads the first 1,048,576 rows (including the header) and discards everything else. There is no warning dialog, no error message, and no indication that data was lost.
If you then save this file, you have permanently lost the truncated rows unless you have a backup of the original CSV.
Importing data via Power Query or Get Data
Power Query is slightly better: it will load up to the row limit and show a warning that results were truncated. But the end result is the same — you cannot get more than 1,048,576 rows into a worksheet.
Copy-pasting from another source
If you try to paste more rows than the worksheet can hold, Excel will show an error: "This operation would cause cells to be shifted off the worksheet." It refuses the paste entirely rather than truncating.
Formulas referencing beyond the limit
You cannot create a formula that references a row beyond 1,048,576. A reference like =A1048577 is simply not recognized as valid, and dynamic references built with INDIRECT or OFFSET return a #REF! error when they point past the grid. There is no way, even indirectly, to address data beyond the limit.
Solution 1: Split the file into multiple sheets
The simplest workaround is to split your large file into chunks that fit within Excel's limit. You can do this with a quick command-line operation:
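One sketch using standard Unix tools (split, head, tail — available on Linux and macOS; the filenames and the tiny chunk size here are demo values, in practice you would use something like -l 1000000):

```shell
# Demo: a CSV with a header plus 10 data rows, split into chunks of 4
# data rows each, with the header repeated in every chunk.
printf 'id,value\n' > big.csv
seq 1 10 | awk '{print $1 "," $1 * 2}' >> big.csv

head -n 1 big.csv > header.csv              # keep the header aside
tail -n +2 big.csv | split -l 4 - chunk_    # split only the data rows
for f in chunk_*; do
  cat header.csv "$f" > "$f.csv" && rm "$f" # re-attach the header
done
```

Splitting the data rows separately from the header ensures every chunk opens in Excel as a self-describing file.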
Or with Python:
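A minimal sketch using only the standard library's csv module (the function name and output naming scheme are illustrative):

```python
import csv

def split_csv(path, rows_per_file=1_048_575):
    """Split a CSV into numbered parts, each small enough for one Excel sheet.

    The default of 1,048,575 data rows leaves room for the repeated
    header row in every part.
    """
    stem = path.rsplit(".", 1)[0]
    parts = []
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        out, writer = None, None
        for i, row in enumerate(reader):
            if i % rows_per_file == 0:        # start a new part
                if out:
                    out.close()
                name = f"{stem}_part{i // rows_per_file + 1}.csv"
                parts.append(name)
                out = open(name, "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)       # repeat the header in each part
            writer.writerow(row)
        if out:
            out.close()
    return parts
```

Calling split_csv("export.csv") on a 2-million-row file would produce export_part1.csv and export_part2.csv, each openable in Excel without truncation.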
Pros: Simple, uses tools you already know.
Cons: Your data is now fragmented across multiple files. Cross-file analysis (like finding the max value across all 2 million rows) requires manual merging or formulas that reference multiple workbooks.
Solution 2: Use Power Query to aggregate before loading
Power Query (built into Excel since 2016) can connect to large data sources and transform the data before loading it into a worksheet. Instead of loading all 2 million rows, you can filter, group, or summarize down to a manageable result set.
To use Power Query with a large CSV:
- Go to Data > Get Data > From File > From Text/CSV
- Select your file. In the preview, click Transform Data (not Load)
- In the Power Query editor, apply filters, group-by operations, or remove unnecessary columns
- Click Close & Load to send the reduced dataset to a worksheet
Power Query processes data in a streaming fashion, so it can handle files much larger than the worksheet limit. The key is that your output must fit within 1,048,576 rows.
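The same reduce-before-loading pattern can be done outside Excel as well. A sketch with pandas, which can also stream a CSV in chunks (the file name, column names, and tiny chunk size are demo values; in practice you would use a chunksize like 100,000):

```python
import pandas as pd

# Demo data standing in for a large export.
pd.DataFrame({
    "region": ["east", "west", "east", "west", "east"],
    "amount": [10, 20, 30, 40, 50],
}).to_csv("big.csv", index=False)

# Stream the file in chunks and keep a running group-by total, so the
# full dataset never has to fit in memory (or in a worksheet) at once.
totals = {}
for chunk in pd.read_csv("big.csv", chunksize=2):
    for region, amount in chunk.groupby("region")["amount"].sum().items():
        totals[region] = totals.get(region, 0) + amount

summary = pd.DataFrame(sorted(totals.items()), columns=["region", "amount"])
summary.to_csv("summary.csv", index=False)  # small enough for any Excel sheet
```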
Solution 3: Use Microsoft Access for larger datasets
Microsoft Access can handle up to 2 GB per database (roughly 20-30 million rows depending on column types). If you are already in the Microsoft ecosystem and your data fits within this limit, Access is a natural step up from Excel.
You can link an Access query back to Excel, using Excel as the front end for reports and charts while Access handles the data storage.
Pros: Stays within Microsoft ecosystem, supports SQL queries, handles millions of rows.
Cons: Access is Windows-only, not included in all Microsoft 365 plans, has its own 2 GB limit, and has a dated interface that many users find unintuitive.
Solution 4: Open the file in Viztab
If you need the spreadsheet experience — scrolling through rows, sorting columns, filtering data, applying formulas — but without the row limit, Viztab is purpose-built for this scenario.
- Open Viztab. Go to viztab.com/app in Chrome, Firefox, Safari, or Edge. No install or account needed.
- Import your file. Drag in your CSV or XLSX file. Viztab handles millions of rows: no truncation, no 1,048,576 cap.
- Work without limits. Sort, filter, search, and use 370+ formulas across your entire dataset. Export the result when done.
Viztab processes everything locally — your data is never uploaded to a server. It uses a streaming engine that indexes data as it loads, so you can start working immediately even on multi-gigabyte files. There is no row limit, no column limit, and no file size cap beyond your machine's available memory.
How to check if your data was truncated
If you have been working with a large file in Excel and are now worried about truncation, here is how to check:
- Check the original file's row count. Open a terminal and run `wc -l yourfile.csv`. If it reports more than 1,048,576 lines (the per-sheet limit, with the header counting as one row), Excel truncated it.
- Check Excel's last row. Press `Ctrl+End` (Windows) or `Cmd+Fn+Right` (Mac) to jump to the last cell with data. If you land on row 1,048,576, your data was likely truncated.
- Compare totals. If you know the expected sum or count of a column from the source system, compare it against Excel's result. A mismatch means data was lost.
The most insidious aspect of Excel's truncation is that it happens silently. Always verify row counts when importing large files.
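The row-count check can also be scripted. A sketch using Python's csv module, which (unlike `wc -l`) counts records correctly even when a quoted field contains embedded newlines:

```python
import csv

EXCEL_MAX_ROWS = 1_048_576  # rows per XLSX worksheet, header included

def csv_record_count(path):
    """Count CSV records; quoted embedded newlines do not inflate the count."""
    with open(path, newline="") as f:
        return sum(1 for _ in csv.reader(f))

# A file will be truncated on open if it holds more records than one
# sheet can display, i.e. csv_record_count(path) > EXCEL_MAX_ROWS.
```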
Frequently asked questions
How many rows can Excel handle?

Excel supports a maximum of 1,048,576 rows per worksheet in the XLSX format (Excel 2007 and later). The older XLS format (Excel 97-2003) is limited to 65,536 rows. These are hard limits built into the file format and cannot be changed.

What happens if I open a CSV with more than 1 million rows in Excel?

When you open a CSV with more than 1,048,576 rows in Excel, it silently truncates the file — loading only the first 1,048,576 rows and discarding the rest without warning. This is dangerous because you may not realize data is missing, leading to incorrect analysis or reports.

Can the Excel row limit be increased?

No. The 1,048,576 row limit is a fundamental constraint of the XLSX file format. No Excel setting, add-in, or version can increase it. Microsoft has not changed this limit since Excel 2007. For datasets larger than this, you need a different tool.

What should I use instead of Excel for files over 1 million rows?

For interactive spreadsheet work with more than 1 million rows, Viztab handles multi-million row files in the browser with no row limits. For programmatic analysis, Python with pandas or R are popular. For SQL-based analysis, DuckDB or loading data into a database are good options. Power Query within Excel can also connect to large external data sources.
1,048,576 rows is not enough
Viztab opens the files that Excel truncates. No row limits, no silent data loss, no install required.
Open Viztab