I saw this post and it motivated me to do a quick test:
https://phanpy.social/#/hachyderm.io/s/115891592999188880

Stop opening huge files in screen editors.
Screen editors (nvi, vim, etc.) assume you want to scroll,
see context, and move a cursor interactively.
Huge files break those assumptions.
For large files (1GB+):
Inspect: head, tail, grep
Understand structure: awk, sed -n (stream, don't load)
Surgical changes: ed or sed (examples below)
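A minimal sketch, assuming a hypothetical multi-gigabyte file named big.txt (the filename, the ERROR pattern, and the line numbers are placeholders). Each command streams; nothing loads the whole file into memory:

  head -n 5 big.txt                        # peek at the start
  tail -n 5 big.txt                        # peek at the end
  grep -c 'ERROR' big.txt                  # count matching lines
  awk 'NR % 1000000 == 0 {print NR, $0}' big.txt   # sample every millionth line
  sed -n '12345678,12345698p' big.txt      # print only a known line range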
Open a screen editor only when you need to rewrite text.
Benchmark (1GB text file):
nvi -> 20.1s (eager line indexing ~25M lines)
vim -> 7.7s (lazy loading, deferred UI cost)
ed -> 4.0s (I/O-bound buffering, no TUI overhead)
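For context, a load-and-quit timing like this can be reproduced roughly as follows (a sketch, assuming GNU coreutils and bash; big.txt and the repeated line are placeholders; nvi and vim are timed the same way by opening the file and quitting immediately):

  # build a ~1GB file of ~25 million short lines
  yes 'benchmark line for a roughly one gigabyte file' | head -n 25000000 > big.txt

  # ed: load the whole buffer, then quit
  time sh -c 'printf "q\n" | ed -s big.txt'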
Large files don't need better editors.
They need better workflows.
For huge files, the right solution is not tuning nvi,
but using the right tools:
shell for inspection, ed for known changes,
and nvi when interactive rewriting is actually needed.
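For the "known changes" case, a hedged example of a scripted, non-interactive edit with ed (big.txt, old-host, and new-host are placeholders):

  # substitute old-host with new-host on every line, write, quit
  printf '%s\n' ',s/old-host/new-host/g' w q | ed -s big.txt

No screen is drawn and no cursor is moved; ed just loads the buffer, applies the change, and writes it back.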
PS:
nvi chooses predictability over perceived speed.
The slowdown is not a flaw; it's the cost of correctness
within a screen-editor model.
#vim #vi #ed #unix #linux #sysadmin