752 points by crazypython
zomglings No.26371706
If anyone from the dolt team is reading this, I'd like to make an enquiry:

At bugout.dev, we have an ongoing crawl of public GitHub. We just created a dataset of code snippets crawled from popular GitHub repositories, labeled by language, license, GitHub repo, and commit hash, and we're looking to release it publicly and keep it up to date with our GitHub crawl.

The dataset for a single crawl comes in at about 60GB. We uploaded the data to Kaggle because we thought it would be a good place for people to work with it. Unfortunately, the Kaggle notebook experience is not tailored to datasets this large. Our dataset is a single SQLite database: it takes a long time to load into Kaggle notebooks, and queries are slow enough that I don't think the notebooks are provisioned with SSDs. Our best workaround is to partition it into 3 datasets on Kaggle - train, eval, and development - but that will be a pain to manage for every update, especially as we enrich the dataset with results of static analysis, etc.
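
For anyone poking at the SQLite file locally, here's a minimal sketch in Python. The "snippets" table and "language" column are placeholders rather than the dataset's actual schema, so check sqlite_master first:

    import sqlite3

    conn = sqlite3.connect("snippets.db")  # path to the downloaded SQLite file
    conn.row_factory = sqlite3.Row

    # Check the real schema before assuming anything about it.
    tables = [r["name"] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    print("tables:", tables)

    # Hypothetical query: a small sample of Python snippets.
    for row in conn.execute(
            "SELECT * FROM snippets WHERE language = ? LIMIT 5", ("Python",)):
        print(dict(row))

    conn.close()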

I'd like to explore hosting the public dataset on Dolthub. If this sounds interesting to you, please reach out to me - email is in my HN profile.

replies(5): >>26371719 #>>26371745 #>>26375077 #>>26383000 #>>26383043 #
zomglings No.26371719
This is the dataset on Kaggle - https://www.kaggle.com/simiotic/github-code-snippets
replies(1): >>26371957 #
justinclift No.26371957
Yeah, a database that size is likely to be a challenge unless the computer system it's running on has scads of memory.

One of my projects (DBHub.io) is putting effort towards working through the problem of larger SQLite databases (~10GB), mainly by using bare metal hosts with lots of memory, e.g. 64GB, 128GB, etc.
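
For reference, a rough sketch of the kind of tuning that helps on a big-memory host - this is generic SQLite advice rather than a description of our exact setup, and the sizes are arbitrary:

    import sqlite3

    conn = sqlite3.connect("snippets.db")
    conn.execute("PRAGMA mmap_size = 8589934592")  # memory-map up to 8GB of the file
    conn.execute("PRAGMA cache_size = -2000000")   # ~2GB page cache (negative value = KiB)
    conn.execute("PRAGMA temp_store = MEMORY")     # keep temp tables/indices in RAM
    conn.execute("PRAGMA query_only = ON")         # read-only workload

    # Repeat queries now mostly hit the mmap region / page cache instead of disk.
    print(conn.execute("SELECT count(*) FROM sqlite_master").fetchone()[0])
    conn.close()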

Putting the same data into PostgreSQL, or even MySQL, would likely be much more memory efficient. :)
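
A rough sketch of what that move could look like in Python - the table layout is made up to match the dataset description above, the connection string is a placeholder, and for a real migration pgloader is the usual shortcut:

    import sqlite3
    import psycopg2  # pip install psycopg2-binary

    src = sqlite3.connect("snippets.db")
    dst = psycopg2.connect("dbname=snippets user=postgres")  # placeholder DSN
    cur = dst.cursor()

    # Hypothetical table matching the dataset description above.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS snippets (
            language TEXT, license TEXT, repo TEXT,
            commit_hash TEXT, source TEXT
        )
    """)

    rows = src.execute(
        "SELECT language, license, repo, commit_hash, source FROM snippets")
    batch = rows.fetchmany(10000)
    while batch:
        cur.executemany(
            "INSERT INTO snippets VALUES (%s, %s, %s, %s, %s)", batch)
        batch = rows.fetchmany(10000)

    dst.commit()
    dst.close()
    src.close()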

replies(3): >>26372179 #>>26372188 #>>26376054 #
zachmu No.26372188
We have 200 GB databases in dolt format that are totally queryable. They don't work well for querying on the web, though - you need a local copy to query them effectively. Making web queries as fast as local ones is an ongoing project.
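
If you do pull a local copy: Dolt speaks the MySQL wire protocol, so once the clone is being served with "dolt sql-server" you can query it from any MySQL client. A sketch with placeholder database/table names:

    import mysql.connector  # pip install mysql-connector-python

    # Assumes a local clone is running under "dolt sql-server" with the
    # default settings (root user, no password, port 3306).
    conn = mysql.connector.connect(
        host="127.0.0.1", port=3306, user="root", password="",
        database="code_snippets",  # placeholder database name
    )

    cur = conn.cursor()
    cur.execute("SELECT count(*) FROM snippets")  # placeholder table
    print(cur.fetchone()[0])

    cur.close()
    conn.close()
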
replies(1): >>26373052 #
justinclift No.26373052
Yeah, the "on the web" piece is the thing we're talking about. :)

200GB databases for PostgreSQL (etc.) aren't any kind of amazing.