
95 points by jpiech | 1 comment

Process large (e.g. 4GB+) data sets in a spreadsheet.

Load multi-gigabyte, 32-million-row files in seconds and work with them without crashes, using up to about 500 GB of RAM.

Load, edit in place, split, merge, and clean CSV/text files with up to 32 million rows and 1 million columns.

Use your own Python functions as UDF formulas that can return images and entire CSV files to GS-Calc.
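
A minimal sketch of what such a Python UDF could look like. The function body is ordinary Python; how GS-Calc discovers and calls it, and the TRIMMED_MEAN formula name, are assumptions made here for illustration rather than the documented GS-Calc API:

    # Illustrative only: the mechanism by which GS-Calc registers and
    # invokes this function is assumed, not taken from GS-Calc docs.
    import statistics

    def trimmed_mean(values, trim=0.1):
        """Mean of `values` after dropping the top/bottom `trim` fraction."""
        data = sorted(float(v) for v in values if v is not None)
        if not data:
            return None
        k = int(len(data) * trim)
        kept = data[k:len(data) - k] or data
        return statistics.mean(kept)

    # In a sheet this might then be used like a built-in formula,
    # e.g. =TRIMMED_MEAN(A1:A1000000, 0.05)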

Use a set of statistical pivot-data functions.

Use solver functions with virtually no limit on the number of variables.

Create and display all popular chart types with millions of data points instantly.

Suggestions for improvements are welcome (and often implemented quite quickly).

Yiling-J No.43800380
https://rowzero.io/ can handle 1 billion+ rows and offers native Python support. It is also compatible with Excel and Google Sheets. However, it is a cloud-based solution, and the private hosting option is only available to Enterprise users.
replies(1): >>43801014 #
1. No.43801014