
112 points | jpiech | 1 comment

Process large (e.g. 4GB+) data sets in a spreadsheet.

Load multi-gigabyte, 32-million-row files in seconds and work with them crash-free, using up to about 500 GB of RAM.

Load, edit in place, split, merge, and clean CSV/text files with up to 32 million rows and 1 million columns.

Use your Python functions as UDF formulas that can return images and entire CSV files to GS-Calc.
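The thread doesn't show GS-Calc's actual UDF registration API, so as a rough sketch: a UDF is ultimately just an ordinary Python function over cell values, like this hypothetical one that GS-Calc would call as a formula over two ranges.

```python
# Hypothetical sketch of a Python UDF body. GS-Calc's real registration
# mechanism is not described in the thread; this is only the plain
# function such a formula might wrap.
def weighted_average(values, weights):
    """Return the weighted average of two equal-length numeric sequences."""
    if len(values) != len(weights):
        raise ValueError("values and weights must have the same length")
    total_weight = sum(weights)
    if total_weight == 0:
        raise ValueError("weights must not sum to zero")
    return sum(v * w for v, w in zip(values, weights)) / total_weight

# Example: invoked as a formula over two cell ranges.
print(weighted_average([10, 20, 30], [1, 1, 2]))  # → 22.5
```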

Use a set of statistical pivot data functions.

Use solver functions with virtually no limit on the number of variables.

Create and display all popular chart types with millions of data points instantly.

Suggestions for improvements are welcome (and often implemented quite quickly).

badmonster No.43807498
Are there any plans to add real-time collaboration features, like Google Sheets or Excel Online?
jpiech No.43808040
This is a potentially interesting feature. It could probably be extended relatively easily to a shared environment in GS-Calc, since there are already separate passwords protecting files, sheet/view structures, and cell ranges, plus optional SHA-256 checksums for all data categories; but there are no such plans at the moment. For now, you can use two script commands to release the opened file, periodically check whether it has been modified, report it, and automatically reload the file.
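The release/check/reload workflow described above can be sketched in plain Python. The actual GS-Calc script commands aren't named in the thread, so the reload step below is a hypothetical callback placeholder; the change check is an ordinary mtime poll.

```python
import os
import tempfile
import time


def check_and_reload(path, last_mtime, reload_callback):
    """One polling step: if the file's mtime changed, invoke the reload.

    `reload_callback` stands in for the GS-Calc reload script command,
    which is not named in the thread. Returns the current mtime so the
    caller can carry it into the next step.
    """
    mtime = os.path.getmtime(path)
    if mtime != last_mtime:
        reload_callback(path)
    return mtime


def watch(path, reload_callback, interval=2.0, max_checks=None):
    """Periodically check a released file and reload it when modified."""
    last_mtime = os.path.getmtime(path)
    checks = 0
    while max_checks is None or checks < max_checks:
        time.sleep(interval)
        last_mtime = check_and_reload(path, last_mtime, reload_callback)
        checks += 1


# Demo: bump a temp file's mtime to simulate an external edit,
# then observe a single polling step fire the reload callback.
fd, demo_path = tempfile.mkstemp()
os.close(fd)
start = os.path.getmtime(demo_path)
os.utime(demo_path, (start + 10, start + 10))  # simulate an external edit
check_and_reload(demo_path, start, lambda p: print(f"reloading {p}"))
os.remove(demo_path)
```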