php – 1.3M queries/Hour. How would you construct the queries?

Posted by: admin July 12, 2020


I have an online iPhone turn-based game, with lots of games running at the same time. I'm in the process of optimizing the code, since both my app and the server crashed today.

This is the setup:

Right now I have one table, "matches" (70 fields of data for each row), that keeps track of all the active matches. Every 7 seconds, the iPhone connects, downloads all the matches in the "matches" table that the player is active in, and updates the UI.

This worked great until about 1,000 people downloaded the game and played. The server crashed.

So to optimize, I figure I can create a new table called "matches_needs_update". This table has two columns: name and id. The "id" is the same as the match id in the "matches" table. When a match is updated, it's put in this table.

Now, instead of searching through the whole "matches" table, the query just checks whether the player has any matches that need to be updated, and then gets those matches from the "matches" table.

My question is twofold:

  1. Is this the optimal solution?
  2. If a player is active in, say, 10 matches, is there a good way to get those 10 matches from the "matches" table at the same time, or do I need a for loop doing 10 queries, one for each match:

    `SELECT * FROM matches WHERE id = ?`

Thanks in advance

Answers:

I suggest APC

…as you're on PHP, and I assume you're doing this from a single MySQL database.

It's easy to install. (Note that PHP 6 was never released; on current PHP versions the successor extension is APCu.)

Keep this 1 table in memory and it will fly.
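The idea above can be sketched as a read-through cache. This is a minimal illustration using the APCu functions; `get_match_from_db()` is a placeholder for the real database lookup, and the key format and TTL are assumptions:

```php
<?php
// Read-through cache for a single match row using APCu.
// get_match_from_db() is a hypothetical placeholder for the real DB query.
function get_match(int $matchId): ?array
{
    $key = "match:$matchId";
    $row = apcu_fetch($key, $success);
    if ($success) {
        return $row;                    // cache hit: no database round trip
    }
    $row = get_match_from_db($matchId); // placeholder DB call
    apcu_store($key, $row, 7);          // cache for 7 s, matching the poll interval
    return $row;
}
```

With a 7-second TTL the cached copy can never be staler than one client poll cycle.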


You need to get out of the database. Look to memcache or redis.
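As a sketch of what "getting out of the database" looks like with memcache, here is a hedged example using the PHP `Memcached` class; the key name and the `loadNeedsUpdateFromDb()` helper are illustrative, not from the original post:

```php
<?php
// Cache the "matches that need an update" lookup in Memcached.
// loadNeedsUpdateFromDb() is a hypothetical placeholder for the DB query.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

function matches_needing_update(Memcached $mc, int $userId): array
{
    $key = "needs_update:$userId";
    $ids = $mc->get($key);
    if ($ids === false && $mc->getResultCode() === Memcached::RES_NOTFOUND) {
        $ids = loadNeedsUpdateFromDb($userId); // placeholder DB call
        $mc->set($key, $ids, 7);               // expire with the 7-second poll cycle
    }
    return $ids;
}
```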


Your database looks really small. A table with 70 columns should still return rows within milliseconds, and even hundreds of queries per second should work without any problems.

A couple of traditional pointers

  • Make sure you pool your connections. You should never have to open a new connection each time a client requests data.
  • Make sure there is an index on “user is in match” so that the result will be fetched from the index.
  • I’m sure you have enough memory to hold the entire structure in the cache and with these small tables no additional config should be needed.
  • Make sure your schema is normalized. One table for users, one for matches, and a junction table linking users to the matches they are in.
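A minimal sketch of the normalized schema described above; table and column names are illustrative, and the ~70 per-match fields from the question are elided:

```sql
-- One table per entity, plus a junction table for "user is in match".
CREATE TABLE users (
    id   INT UNSIGNED PRIMARY KEY AUTO_INCREMENT,
    name VARCHAR(64) NOT NULL
);

CREATE TABLE matches (
    id         INT UNSIGNED PRIMARY KEY AUTO_INCREMENT,
    -- ... the ~70 per-match fields go here ...
    updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
                         ON UPDATE CURRENT_TIMESTAMP
);

CREATE TABLE match_players (
    match_id INT UNSIGNED NOT NULL,
    user_id  INT UNSIGNED NOT NULL,
    PRIMARY KEY (match_id, user_id),
    KEY idx_user (user_id)   -- serves "all matches for this user" lookups
);
```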


It's time to start caching things, e.g. with memcache and APC.

As for looping through the matches… that is the wrong way to go about it.
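Instead of ten single-row queries, one query with an `IN` list fetches all the matches in a single round trip. A minimal sketch using PDO (table and column names come from the question; the function name is illustrative):

```php
<?php
// Fetch all of a player's pending matches in one round trip instead of N queries.
// $matchIds would come from the matches_needs_update lookup.
function fetch_matches(PDO $pdo, array $matchIds): array
{
    if ($matchIds === []) {
        return [];
    }
    // Build one positional placeholder per id: IN (?,?,?)
    $placeholders = implode(',', array_fill(0, count($matchIds), '?'));
    $stmt = $pdo->prepare("SELECT * FROM matches WHERE id IN ($placeholders)");
    $stmt->execute(array_values($matchIds));
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}
```

Building the placeholder list dynamically keeps the query parameterized, so the ids are still bound safely rather than interpolated into the SQL string.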

How is a user connected to a match: via a cross-reference table, or does the match table have columns like player1, player2?

Looping through queries is not the way to go. Properly indexing your tables and doing a join to pull all the active matches by userId would be more efficient. Given the number of users, you may also want to split the tables (if you haven't) into active and inactive games.
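The join described above might look like this, assuming a cross-reference table `match_players(match_id, user_id)` (the table name is an assumption, not from the original post):

```sql
-- All of one user's active matches in a single indexed query.
SELECT m.*
FROM matches AS m
JOIN match_players AS mp ON mp.match_id = m.id
WHERE mp.user_id = ?;
```

With an index on `match_players(user_id)`, this resolves the user's match ids from the index and fetches each match row by primary key.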

If there are 6,000 active games and 3,000,000 inactive ones, it's extremely beneficial to partition these tables.
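One simple way to realize that split is an archive table with the same structure, to which finished games are moved periodically. This is only a sketch: the `finished` flag is an assumed column, not part of the question's schema:

```sql
-- Keep `matches` small by moving finished games to an archive table.
CREATE TABLE matches_archive LIKE matches;

-- `finished` is a hypothetical flag marking completed games.
INSERT INTO matches_archive
SELECT * FROM matches WHERE finished = 1;

DELETE FROM matches WHERE finished = 1;
```

Running this in one transaction (or in batches during off-peak hours) keeps the hot table at a few thousand rows instead of millions.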