I don't do a ton of big data stuff, but despite Excel's stated row and column limits, I find it effectively melts down at even 100K rows by 100 columns of data, and forget about adding formulas.
Load multi-GB, 32-million-row files in seconds and work with them without crashes, using up to about 500 GB of RAM.
Load, edit in place, split, merge, and clean CSV/text files with up to 32 million rows and 1 million columns (see the splitting sketch after this list).
Use your own Python functions as UDF formulas that can return images and entire CSV files to GS-Calc (see the UDF sketch after this list).
Use a set of statistical pivot data functions.
Use solver functions with virtually no limit on the number of variables.
Create and display all popular chart types with millions of data points instantly.
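For a sense of why streaming beats Excel-style full loads, here's a rough, hypothetical Python equivalent of just one of those operations, splitting a huge CSV into chunks. This is not GS-Calc's implementation, and the file name is made up:

```python
# Sketch: split a huge CSV into fixed-size chunks by streaming rows
# instead of loading the whole file into memory. Illustrative only;
# GS-Calc does this natively.

import csv

def split_csv(path, rows_per_chunk=1_000_000):
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)          # repeat the header in every chunk
        chunk, out, writer = 0, None, None
        for i, row in enumerate(reader):
            if i % rows_per_chunk == 0:
                if out:
                    out.close()
                chunk += 1
                out = open(f"{path}.part{chunk}.csv", "w",
                           newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
        if out:
            out.close()

# Usage (hypothetical 32-million-row file):
# split_csv("big_export.csv")
```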
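And a minimal sketch of the kind of plain Python function you'd expose as a UDF. The function itself is ordinary Python; the registration step and the exact convention GS-Calc uses for returning images or whole CSV tables are product-specific and not shown here:

```python
# Assumption: GS-Calc calls the function with cell values as arguments
# and displays the return value in the formula's cell.

def trimmed_mean(values, trim_fraction=0.1):
    """Mean of `values` after dropping the top/bottom `trim_fraction`."""
    data = sorted(float(v) for v in values)
    k = int(len(data) * trim_fraction)
    kept = data[k:len(data) - k] or data  # fall back if trimming empties it
    return sum(kept) / len(kept)

if __name__ == "__main__":
    # From a spreadsheet this would look something like
    # =trimmed_mean(A1:A5, 0.2); here we just exercise it directly.
    print(trimmed_mean([1, 2, 3, 4, 100], 0.2))  # 3.0 (drops 1 and 100)
```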
Suggestions for improvements are welcome (and often implemented quite quickly).
The ability to use legacy spreadsheets and macros.
Let's be real: Excel perpetuates itself by being at once awful and also the thing everyone used to use, and thus must still use.
Lots of spreadsheet apps better than Excel have come and gone over the years...