Persistent MySQL connection in Python for social media harvesting


I am using Python to stream large amounts of Twitter data into a MySQL database, and I anticipate the job running for a period of several weeks. I have code that interacts with the Twitter API and gives me an iterator that yields lists, each list corresponding to a database row. What I need is a means of maintaining a persistent database connection for those several weeks. Right now I find myself having to restart the script repeatedly when the connection is lost, as a result of MySQL being restarted.

Does it make sense to use the MySQLdb library, catch exceptions, and reconnect when necessary? Or is there a ready-made solution for this as part of SQLAlchemy or another package? Any ideas are appreciated!

I think the right answer here is to try and handle the connection errors yourself; it sounds like you'd be pulling in a larger library just for this one feature, when a try/except around the write is how it's done at whatever level of the stack you're working. If necessary, you can also multithread these things, since they're I/O-bound (i.e. suited to Python's GIL-constrained threading, as opposed to multiprocessing), and decouple production and consumption with a queue, which may take some of the load off of the database connection. Both ideas are sketched below.
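A minimal sketch of the catch-and-reconnect approach, assuming the MySQLdb driver; the connection parameters, the `tweets` table, and its columns are placeholders for your own schema:

```python
import time
import MySQLdb

class ReconnectingDB(object):
    """Holds one connection and reopens it when MySQL drops it."""

    def __init__(self, **connect_kwargs):
        self.connect_kwargs = connect_kwargs  # host, user, passwd, db, ...
        self.conn = MySQLdb.connect(**connect_kwargs)

    def insert_rows(self, rows, retries=5, wait=5):
        # Hypothetical schema; substitute your own table and columns.
        sql = "INSERT INTO tweets (id, user, text) VALUES (%s, %s, %s)"
        for _ in range(retries):
            try:
                cur = self.conn.cursor()
                cur.executemany(sql, rows)
                self.conn.commit()
                cur.close()
                return
            except MySQLdb.OperationalError:
                # Raised for e.g. 2006 "MySQL server has gone away" and
                # 2013 "Lost connection to MySQL server"; wait, then
                # replace the dead connection and retry the whole batch.
                time.sleep(wait)
                self.conn = MySQLdb.connect(**self.connect_kwargs)
        raise RuntimeError("insert failed after %d attempts" % retries)
```

Note that if MySQL is still down when the retry fires, `MySQLdb.connect` will itself raise `OperationalError`, so in production you may want to loop on the reconnect step as well.

And a sketch of the queue-based decoupling, using the standard-library `Queue` module (renamed `queue` on Python 3); `twitter_rows` stands in for your Twitter iterator and `db` for the wrapper above:

```python
import threading
import Queue  # "import queue" on Python 3

# Bounded, so a slow database backpressures the reader instead of
# letting rows pile up in memory without limit.
row_queue = Queue.Queue(maxsize=10000)

def produce(twitter_rows):
    # Drain the Twitter iterator into the in-memory queue.
    for row in twitter_rows:
        row_queue.put(row)

def consume(db):
    # Pull rows off the queue in batches and write each batch
    # to MySQL in a single round trip.
    while True:
        batch = [row_queue.get()]
        while len(batch) < 500:
            try:
                batch.append(row_queue.get_nowait())
            except Queue.Empty:
                break
        db.insert_rows(batch)

# twitter_rows is your iterator of row-lists; db is a ReconnectingDB.
threading.Thread(target=produce, args=(twitter_rows,)).start()
threading.Thread(target=consume, args=(db,)).start()
```

Batching in the consumer also means a MySQL restart costs you at most one batch retry rather than one retry per row.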
