You have a CSV file. Maybe it's a database export, a log dump from a web server, or a dataset you downloaded for analysis. You double-click, Excel starts loading, and then: nothing. The spinning wheel. The "not responding" message. The crash.
This is one of the most common frustrations for anyone who works with data. The file is perfectly valid. The problem is that the tool you're using was never designed for files this large.
In this guide, we'll cover why large CSV files crash common spreadsheet applications, then walk through the real solutions, from quick command-line tricks to dedicated tools that handle millions of rows without breaking a sweat.
Why large CSV files crash Excel and Google Sheets
Before jumping to solutions, it helps to understand what's going wrong. When you open a CSV file in Excel, Google Sheets, or most spreadsheet applications, the software tries to do two things at once: parse the entire file into memory and render it all in a grid. For a 50 MB file with 500,000 rows, that might work. For a 500 MB file with 5 million rows, it usually won't.
There are three main reasons these tools fail with large CSVs:
- Hard row limits. Excel has a maximum of 1,048,576 rows per worksheet. If your CSV has more rows than that, Excel will silently truncate it or refuse to open it. Google Sheets caps out at 10 million cells total (rows times columns), which means a 20-column file maxes out at 500,000 rows.
- Memory consumption. Spreadsheet applications load the entire file into RAM and then create an internal representation that is often 5-10x larger than the file on disk. A 1 GB CSV can easily require 5-10 GB of RAM to display, which exceeds what most machines have available for a single application.
- Rendering overhead. These tools try to calculate cell widths, apply formatting, and build a scrollable viewport for every single row at once. With millions of rows, the rendering engine bogs down even if the data fits in memory.
| Tool | Row limit | Practical size limit |
|---|---|---|
| Microsoft Excel | 1,048,576 | ~200-500 MB |
| Google Sheets | ~500K (20 cols) | ~50 MB import limit |
| LibreOffice Calc | 1,048,576 | ~200-500 MB |
| Apple Numbers | 1,000,000 | ~100 MB |
If your file exceeds any of these limits, you need a different approach.
Approach 1: Command-line tools
If you just need to inspect a large CSV — peek at the first few rows, count lines, or extract a subset — command-line tools are the fastest option. These tools stream through files line by line, so they use almost no memory regardless of file size.
Preview the first rows
The head command prints the first N lines of a file. This is the quickest way to see what your CSV looks like:
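A minimal example; `large_data.csv` is a placeholder name, and the sample lines at the top exist only so the sketch runs end to end:

```shell
# Tiny sample file standing in for a real CSV; use your own file instead
printf 'id,name,score\n1,alice,95\n2,bob,87\n3,carol,91\n4,dan,78\n' > large_data.csv

# Print the header plus the first three data rows
head -n 4 large_data.csv

# Align columns for readability (simple comma-separated files only)
head -n 4 large_data.csv | column -t -s,
```

Because `head` stops reading after N lines, this returns instantly even on a multi-gigabyte file.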
Count rows
Before deciding how to handle a file, it helps to know how many rows you are dealing with:
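For example, with `wc -l` (the filename is a placeholder, and the sample line just makes the sketch runnable):

```shell
# Tiny sample file; substitute your real CSV
printf 'id,name\n1,a\n2,b\n3,c\n' > large_data.csv

# Total line count, including the header
wc -l < large_data.csv

# Data rows only, assuming a single header line
echo $(( $(wc -l < large_data.csv) - 1 ))
```

Note this counts physical lines, so a CSV with quoted multiline fields will report more lines than logical rows.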
Extract specific columns or filter rows
Use awk or cut to pull out specific columns, or grep to filter rows matching a pattern:
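A sketch of all three, assuming a simple CSV with no quoted commas (the filename and column layout are placeholders):

```shell
# Sample file standing in for a real CSV
printf 'id,name,score\n1,alice,150\n2,bob,42\n3,carol,120\n' > large_data.csv

# cut: pull out columns 1 and 3
cut -d, -f1,3 large_data.csv

# awk: keep the header plus rows where column 3 exceeds 100
awk -F, 'NR == 1 || $3 > 100' large_data.csv

# grep: keep the header plus rows matching a pattern
{ head -n 1 large_data.csv; grep 'alice' large_data.csv; }
```

These tools split on raw commas, so fields containing quoted commas will be mis-split; for those cases, use a CSV-aware tool like Miller (covered below).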
Pros: Instant startup, near-zero memory usage, works on files of any size. Pre-installed on macOS and Linux.
Cons: No visual interface. You need to know the commands. Sorting and complex filtering require more advanced usage. Not available natively on Windows (though WSL provides access).
Approach 2: Python with pandas
If you need to do real analysis — aggregations, joins, pivots — Python with the pandas library is the go-to tool for many data professionals. The key for large files is to read them in chunks rather than all at once:
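A minimal sketch of chunked reading; the filename and the `amount` column are hypothetical, and the sample file at the top exists only so the code runs as-is:

```python
import pandas as pd

# Tiny sample standing in for a multi-GB file; point at your real CSV instead
pd.DataFrame({"amount": [10, 20, 30, 40, 50]}).to_csv("large_data.csv", index=False)

# chunksize makes read_csv yield DataFrames of at most N rows,
# so memory use stays bounded no matter how large the file is
total = 0
rows = 0
for chunk in pd.read_csv("large_data.csv", chunksize=2):  # use ~100_000 for real files
    rows += len(chunk)
    total += chunk["amount"].sum()

print(f"{rows} rows, total amount = {total}")  # prints: 5 rows, total amount = 150
```

The pattern is the same for any aggregation that can be accumulated chunk by chunk (sums, counts, group totals); only operations that need all rows at once, like a global sort, require a different approach.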
If your file fits in memory (roughly, if your machine has 2x the file size in free RAM), you can read it all at once:
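For the in-memory case, a sketch like the following; the `status` column is a hypothetical example of a low-cardinality field, and the sample file only makes the snippet runnable:

```python
import pandas as pd

# Tiny sample standing in for a real file
pd.DataFrame(
    {"status": ["ok", "ok", "error"], "amount": [1, 2, 3]}
).to_csv("large_data.csv", index=False)

# One-shot read; dtype hints such as "category" for repeated string values
# can cut the in-memory footprint substantially
df = pd.read_csv("large_data.csv", dtype={"status": "category"})

df.info(memory_usage="deep")  # inspect the actual RAM footprint
```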
Pros: Extremely powerful for analysis. Handles files well beyond Excel's limits. Large ecosystem of tools (matplotlib, scikit-learn, etc.).
Cons: Requires Python knowledge. Setup can be intimidating for non-programmers. Still memory-bound for very large files (consider Polars or DuckDB for 10 GB+ files).
Approach 3: Dedicated large-file viewers
Not everyone wants to write code or use the command line. Several tools are specifically designed to open large CSV files with a graphical interface:
- Viztab — opens multi-GB CSV files in your browser with no upload, streaming rendering, and a full spreadsheet interface (sort, filter, formulas).
- CSVed — a lightweight Windows-only CSV editor that handles large files better than Excel.
- EmEditor — a text editor for Windows that can open files up to 248 GB with CSV-specific features.
- Miller (mlr) — a command-line tool that is like awk but CSV-aware, good for data transformations.
The advantage of dedicated viewers is that they stream data rather than loading it all into memory. You can scroll through millions of rows as if the file were small.
How to open a large CSV in Viztab
If you want a graphical spreadsheet experience without the crashes, Viztab is the fastest path. It works directly in your browser and processes everything locally — your data never leaves your machine.
Open Viztab
Go to viztab.com/app in any modern browser. No signup, no install, no account required.
Drop your CSV
Drag and drop your CSV file onto the app, or click to browse. Files up to multiple gigabytes are supported.
Explore your data
Scroll, sort, filter, and search across every row. Apply any of 370+ formulas. Export as CSV or XLSX when done.
Viztab auto-detects delimiters (comma, tab, semicolon, pipe) and handles quoted fields, multiline values, and UTF-8 encoding correctly. It uses a high-performance streaming engine that indexes your data as it loads, so you can start working before the file finishes importing.
Tips for working with large CSV files
Regardless of which tool you choose, these practices will save you time:
- Know your row count first. Run `wc -l filename.csv` before trying to open the file. If it is over a million rows, skip Excel entirely.
- Check the file size. As a rule of thumb, files under 100 MB usually work in most spreadsheets. Between 100 MB and 500 MB, expect slowness. Over 500 MB, use a dedicated tool.
- Split if you must. If you absolutely need Excel, split the file into smaller chunks first: `split -l 500000 large_data.csv part_` will create files of 500,000 rows each.
- Use the right encoding. Large CSV files are sometimes in non-UTF-8 encodings (Latin-1, Windows-1252). If you see garbled characters, check the encoding with `file large_data.csv` and convert with `iconv` if needed.
- Compress for storage. CSV files compress extremely well. A 1 GB CSV might compress to 100 MB with gzip. Keep the compressed version and decompress only when working with it.
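The splitting and compression tips can be combined into one pass; a sketch, with placeholder filenames and a tiny sample file so it runs as written (a real file would use `-l 500000`):

```shell
# Sample file standing in for a real CSV
printf 'id,v\n1,a\n2,b\n3,c\n4,d\n5,e\n' > large_data.csv

# Split into fixed-size pieces; split copies the header only into part_aa
split -l 2 large_data.csv part_

# Re-attach the header to every part except the first
for f in part_a[b-z]; do
  [ -e "$f" ] || continue
  { head -n 1 large_data.csv; cat "$f"; } > "$f.csv" && rm "$f"
done

# Compress for storage; -k keeps the original alongside large_data.csv.gz
gzip -kf large_data.csv
```

Each `part_*.csv` is now a self-contained CSV with its own header row, ready to open in Excel.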
Frequently asked questions
How many rows can Excel open from a CSV file?
Excel can open CSV files with up to 1,048,576 rows (its hard row limit). However, it often crashes or becomes unresponsive with files much smaller than that due to memory constraints. Files over 200-300 MB typically cause problems even when row counts are within limits.
What is the Google Sheets limit for CSV files?
Google Sheets has a limit of 10 million cells. For a CSV with 20 columns, that is 500,000 rows. Files larger than about 50 MB will fail to import entirely. Google Sheets is designed for collaboration, not for large-scale data work.
How do I open a 5 GB CSV file?
For a 5 GB CSV, your best options are: a dedicated large-file viewer like Viztab (handles multi-GB files in the browser), command-line tools like awk or csvkit for quick analysis, or Python with pandas using chunked reading. Standard spreadsheet applications will not work at this size.
Are there free tools for opening large CSV files?
Yes. Command-line tools like head, tail, and awk are free and pre-installed on macOS and Linux. Viztab offers free CSV viewing for files up to 1,000 rows with a graphical interface. For unlimited rows, Viztab Pro is available. Python with pandas is another free option if you are comfortable with code.
Stop fighting your tools
Viztab opens the CSV files that crash everything else. No install, no upload, no row limits.
Open Viztab