
353 points HunOL | 2 comments
dv35z No.45781964
Curious if anyone has strategies on how to perform parallel writes to an SQLite database using Python's `multiprocessing` Pool.

I am using it to loop through a database of 11,000 words, hitting an HTTP API (ChatGPT) for each one to generate an example sentence. I would love to launch these API calls asynchronously and have each one update its database row when ready, but I'm not sure how to handle the database getting hit by all these writes from (as I understand it) multiple instances of the same Python program/function.
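
A minimal sketch of one common way to structure this: the pool workers only make the API calls, and the parent process performs every write, so SQLite only ever sees a single writer. The `words` table schema, the `call_chatgpt_api` helper, and the pool size are all hypothetical stand-ins:

```python
import sqlite3
from multiprocessing import Pool

def call_chatgpt_api(word):
    # Stand-in for the real HTTP call to the API.
    return f"Example sentence using {word}."

def generate_sentence(word):
    # Workers do only network-bound work; no database access here.
    return word, call_chatgpt_api(word)

if __name__ == "__main__":
    conn = sqlite3.connect("words.db")  # hypothetical schema: words(word, sentence)
    words = [row[0] for row in conn.execute("SELECT word FROM words")]

    # Every write happens in this parent process, so SQLite never
    # sees more than one writer at a time.
    with Pool(processes=10) as pool:
        for word, sentence in pool.imap_unordered(generate_sentence, words):
            conn.execute(
                "UPDATE words SET sentence = ? WHERE word = ?",
                (sentence, word),
            )
        conn.commit()
    conn.close()
```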

replies(4): >>45781983 #>>45782285 #>>45782358 #>>45791720 #
1. zie No.45782285
Technically SQLite can have only one writer at any given moment, but it can appear to work across multiple writers if you let it serialize the calls for you.

SQLite will not do what you want out of the box; you have to turn on some feature flags (PRAGMAs) to get it to behave. You need WAL mode, etc. Read:

* https://kerkour.com/sqlite-for-servers
* https://zeroclarkthirty.com/2024-10-19-sqlite-database-is-lo...
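
Concretely, from Python's built-in `sqlite3` module that boils down to a few lines. A sketch — the filename is a placeholder, and the links above discuss the trade-offs:

```python
import sqlite3

conn = sqlite3.connect("words.db")  # placeholder filename

# WAL mode: readers no longer block the (single) writer, and vice versa.
conn.execute("PRAGMA journal_mode=WAL")

# On lock contention, retry for up to 5 seconds instead of failing
# immediately with "database is locked".
conn.execute("PRAGMA busy_timeout=5000")

# With WAL, NORMAL is a common durability/speed trade-off.
conn.execute("PRAGMA synchronous=NORMAL")
```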

My larger question is: why multiprocessing? This looks like an I/O-heavy workload, not a CPU-bound one, so Python's asyncio or threads would probably serve you better.

multiprocessing is for when your resource hog is the CPU (roughly one Python process per core), not I/O.
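
A minimal asyncio sketch of that idea, again funneling every write through one connection so there is only ever a single writer. The `words.db` schema is hypothetical and the API call is simulated with a sleep:

```python
import asyncio
import sqlite3

async def fetch_sentence(word, sem):
    async with sem:  # cap the number of in-flight API calls
        # Placeholder for a real async HTTP call (e.g. via aiohttp);
        # here we just simulate network latency.
        await asyncio.sleep(0.1)
        return word, f"Example sentence using {word}."

async def main():
    conn = sqlite3.connect("words.db")  # hypothetical schema: words(word, sentence)
    words = [row[0] for row in conn.execute("SELECT word FROM words")]

    sem = asyncio.Semaphore(20)  # at most 20 concurrent requests
    tasks = [asyncio.create_task(fetch_sentence(w, sem)) for w in words]
    for task in asyncio.as_completed(tasks):
        word, sentence = await task
        # Writes are synchronous but quick, and all go through this
        # one connection, so there is no lock contention.
        conn.execute("UPDATE words SET sentence = ? WHERE word = ?",
                     (sentence, word))
    conn.commit()
    conn.close()

asyncio.run(main())
```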

replies(1): >>45784666 #
2. dv35z No.45784666
I will check into `asyncio` and Python threads. I used multiprocessing as my first foray into asynchronous programming. The previous use case was using Python + multiprocessing to run macOS `say` (via Python's `subprocess`), so I could invoke it 10-20 times simultaneously rather than waiting for each to complete. I experimented a bit with how many concurrent processes to run (using `time` to clock how long the runs took).
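
Presumably something along these lines — a sketch, with a made-up word list and pool size (macOS only, since it shells out to `say`):

```python
import subprocess
from multiprocessing import Pool

def speak(word):
    # Each worker shells out to the macOS `say` command and blocks
    # until the speech finishes.
    subprocess.run(["say", word], check=True)

if __name__ == "__main__":
    words = ["hello", "world", "sqlite"]  # made-up sample input
    # Run up to 10 `say` invocations at the same time.
    with Pool(processes=10) as pool:
        pool.map(speak, words)
```

Since each worker just waits on an external process, this is another I/O-shaped workload: a thread pool would have worked equally well here.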