
369 points by HunOL | 1 comment
dv35z ◴[] No.45781964[source]
Curious if anyone has strategies on how to perform parallel writes to an SQLite database using Python's `multiprocessing` Pool.

I am using it to loop through a database of 11,000 words, hit an HTTP API (ChatGPT) for each one, and generate an example sentence for the word. I would love to be able to launch these API calls asynchronously and have each one come back and update its database row when ready, but I'm not sure how to handle the database getting hit by all these writes from (as I understand it) multiple instances of the same Python program/function.
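For reference, here is a minimal sketch of one strategy under stated assumptions: the worker processes do only the API calls, and every write is funneled back through the parent process so a single connection touches the database. The table and column names (`words`, `word`, `sentence`) and `generate_sentence` are hypothetical placeholders for the real schema and the ChatGPT call.

```python
import sqlite3
from multiprocessing import Pool

def generate_sentence(word):
    # Placeholder for the real HTTP/ChatGPT call; returns (word, sentence).
    return word, f"Example sentence using {word}."

def main():
    conn = sqlite3.connect("words.db")  # hypothetical database file
    words = [row[0] for row in conn.execute("SELECT word FROM words")]

    with Pool(processes=8) as pool:
        # imap_unordered yields each result as soon as a worker finishes,
        # and only the parent process ever writes to the database.
        for word, sentence in pool.imap_unordered(generate_sentence, words):
            conn.execute(
                "UPDATE words SET sentence = ? WHERE word = ?",
                (sentence, word),
            )
            conn.commit()
    conn.close()

if __name__ == "__main__":
    main()
```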

replies(4): >>45781983 #>>45782285 #>>45782358 #>>45791720 #
1. crazygringo ◴[] No.45791720[source]
It should just work.

If one thread is writing and another thread tries to write, the first thread will hold the file write lock, and the second thread will wait to write until that lock is released.

I've written code using the pattern you describe and it's totally fine.
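For illustration, a minimal sketch of that direct approach, where each worker process opens its own connection and writes its own row. The `timeout` argument to `sqlite3.connect` makes a blocked writer wait for the lock rather than raising "database is locked" immediately; enabling WAL mode is optional but reduces contention between readers and writers. The schema names and `generate_sentence` are hypothetical placeholders.

```python
import sqlite3
from multiprocessing import Pool

DB_PATH = "words.db"  # hypothetical database file

def generate_sentence(word):
    # Placeholder for the real HTTP/ChatGPT call.
    return f"Example sentence using {word}."

def process_word(word):
    sentence = generate_sentence(word)
    # timeout=30 -> wait up to 30 s for another writer's lock to clear.
    conn = sqlite3.connect(DB_PATH, timeout=30)
    try:
        conn.execute("PRAGMA journal_mode=WAL")
        conn.execute(
            "UPDATE words SET sentence = ? WHERE word = ?",
            (sentence, word),
        )
        conn.commit()
    finally:
        conn.close()

def main():
    conn = sqlite3.connect(DB_PATH)
    words = [row[0] for row in conn.execute("SELECT word FROM words")]
    conn.close()

    with Pool(processes=8) as pool:
        pool.map(process_word, words)

if __name__ == "__main__":
    main()
```

Since each row is written exactly once and the API call dominates the runtime, the brief waits on SQLite's write lock shouldn't matter much here.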