I have a mixed-language (Python & PHP) web project that uses any one of several relatively large word lists (the smallest list is ~13k entries, the largest is >250k entries).
Based on user input, the app will spit back a unique word-set (along the lines of how What3Words maps grids on the earth’s surface to a word.word.word triplet) drawn from whichever word list has been chosen.
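To make that concrete, here's roughly the kind of lookup I mean, assuming the chosen list is already loaded into memory (the hashing scheme and function name are just illustrative, not the actual implementation):

```python
import hashlib

def words_for(user_input: str, word_list: list[str], n: int = 3) -> str:
    """Map arbitrary user input to a deterministic n-word set
    drawn from the chosen word list (illustrative sketch only)."""
    digest = hashlib.sha256(user_input.encode("utf-8")).digest()
    # Use successive 4-byte chunks of the hash as indices into the list
    indices = [
        int.from_bytes(digest[i * 4:(i + 1) * 4], "big") % len(word_list)
        for i in range(n)
    ]
    return ".".join(word_list[i] for i in indices)
```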
Given the overhead of setting up a database connection, querying the database, returning the results, and then closing the connection, when does it make sense to move from individual files (in the format of one word per line – always in the same order) to individual tables in a database?
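The database version I'm picturing would be something like the following (SQLite as a stand-in; the table and column names are hypothetical), with the connect/query/close cycle paid on every request:

```python
import sqlite3

def word_by_position(db_path: str, table: str, position: int) -> str:
    """Fetch the word stored at a given position, opening and
    closing the connection per request (the overhead in question)."""
    conn = sqlite3.connect(db_path)
    try:
        # `table` comes from a small fixed set of list names, not user input
        cur = conn.execute(f"SELECT word FROM {table} WHERE id = ?", (position,))
        row = cur.fetchone()
        return row[0] if row else ""
    finally:
        conn.close()
```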
My hunch (I haven’t had enough usage yet to verify it one way or the other) is that it’s not going to matter until/unless the app is getting thousands of hits per second, or the lists start to exceed available memory on the machine (loading a 2.5 MB file into RAM just to pluck a few lines from it has its own performance drawbacks, too – but it’s simpler to get going initially).
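For comparison, the flat-file version could either slurp the whole file or pluck individual lines lazily (file path and helper names here are hypothetical):

```python
from itertools import islice

def word_from_file_all(path: str, line_no: int) -> str:
    """Simple version: load the entire word list into memory, then index it."""
    with open(path, encoding="utf-8") as f:
        return f.read().splitlines()[line_no]

def word_from_file_lazy(path: str, line_no: int) -> str:
    """Pluck a single line without reading the whole file into RAM."""
    with open(path, encoding="utf-8") as f:
        return next(islice(f, line_no, line_no + 1)).rstrip("\n")
```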
From user warren on Stack Overflow: https://stackoverflow.com/questions/67607519/when-does-creating-a-single-large-table-make-more-sense-than-a-single-file-for-a