I have web pages that take 10 – 20 database queries in order to get all the required data.
Normally after a query is sent out, the Django thread/process blocks waiting for the results to come back, then resumes execution until it reaches the next query.
Is there any way to issue all the queries asynchronously so that they can be processed by the database server(s) in parallel?
I’m using MySQL but would like to hear about solutions for other databases too. For example, I heard that PostgreSQL has an async client library – how would I use that in this case?
This very recent blog entry seems to imply that it’s not built into either the Django or Rails frameworks. I think it covers the issue well and is well worth a read, along with the comments.
I think I remember Cal Henderson mentioning this deficiency somewhere in his excellent talk http://www.youtube.com/watch?v=i6Fr65PFqfk
My naive guess is that you might be able to hack something together with separate Python libraries, but you would lose much of the ORM/template lazy-evaluation machinery Django gives you, to the point that you might as well be using another stack. Then again, if you are only optimizing a few views in a large Django project, it might be fine.
Just load the template with basic markup, then issue several AJAX requests to execute the queries and load the data. You can even show a loading animation, so the user gets a Web 2.0 feel instead of a gloomy page load. Of course, this means several more HTTP requests per page, but it’s up to you to decide whether that’s worth it.
Here is how my example looks:
http://artiox.lv/en/search?query=test&where_to_search=all (broken link)
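To sketch the server side of this approach: each fragment of the page gets its own endpoint that runs one query and returns JSON, and the page's JavaScript fires all the requests in parallel. A framework-agnostic sketch, with plain functions standing in for Django views and made-up data standing in for the ORM calls:

```python
# Sketch of one per-fragment JSON endpoint: the page shell loads
# instantly, then JavaScript fetches each fragment concurrently.
# Plain functions stand in for Django views; the data is made up.
import json

# Pretend query results, standing in for ORM calls.
FAKE_DB = {
    "recent_posts": ["post 1", "post 2"],
    "comment_count": 42,
}

def fragment_view(fragment_name):
    # In Django this would be a view: run the one query this
    # fragment needs and return the data serialized as JSON.
    payload = {"fragment": fragment_name, "data": FAKE_DB[fragment_name]}
    return json.dumps(payload)

# The browser would issue one XMLHttpRequest per fragment; each
# response fills in one part of the page as it arrives.
body = fragment_view("recent_posts")
print(body)
```

Since the browser issues the requests concurrently, the database ends up processing the queries in parallel as the question asked, just via separate HTTP requests rather than one view.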
Try Celery. There’s a bit of overhead in having to run an AMQP server, but it might do what you want. I’m not sure about the concurrency of the DB, though. Also, if you want speed from your DB, I’d recommend MongoDB (but you’ll need django-nonrel for that).
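Celery's model is essentially a job queue consumed by workers: you enqueue the slow work and collect results later. A minimal in-process approximation of that idea using only the stdlib (with Celery the queue would be the AMQP broker, the workers separate processes, and the jobs `@task`-decorated functions called via `.delay()`; the jobs below are trivial stand-ins for queries):

```python
# Stdlib sketch of the job-queue model Celery provides: producers
# enqueue jobs, worker threads execute them and record the results.
import queue
import threading

jobs = queue.Queue()
results = {}

def worker():
    while True:
        job = jobs.get()
        if job is None:               # sentinel: shut this worker down
            break
        name, func, args = job
        results[name] = func(*args)   # run the "query"
        jobs.task_done()

# Start a small pool of workers.
threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()

# Enqueue jobs; these stand in for the page's database queries.
jobs.put(("sum", sum, ([1, 2, 3],)))
jobs.put(("max", max, ([4, 5, 6],)))
jobs.put(("len", len, (["a", "b"],)))

jobs.join()                           # wait until every job is processed
for _ in threads:
    jobs.put(None)                    # stop the workers
for t in threads:
    t.join()

print(sorted(results.items()))
```

The trade-off versus the thread-pool-per-request approach is that the workers (and broker, in Celery's case) outlive any single request, which adds operational overhead but lets the work happen outside the web process entirely.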