
android – How does a server handle web service requests from multiple clients

Posted by: admin June 15, 2020

Questions:

I just completed an Android application that uses web services to connect to a remote database. I was working on localhost.

Now, I plan to host my web services on a server. Let’s say I have my Android application installed on any number of different client smartphones. Each smartphone user calls the web service at the same time.

Now how does the server handle these requests? Does it execute one thread per request? I want to know about the server processing in detail. Considering, all phones use GPRS, will there be any sort of delay in such a situation?

BTW, my web services are all SOAP based and the server I plan to use later will be an SQL Server. I have used .NET framework for creating web services.

Answers:

This is the general concept; it is not Android-specific.

Usually, each of the users sends an HTTP request for the page. The server receives the requests and delegates them to different workers (processes or threads).

Depending on the URL given, the server reads a file and sends it back to the user. If the file is a dynamic file such as a PHP file, the file is executed before it’s sent back to the user.

Once the requested file has been sent back, the server usually closes the connection after a few seconds.
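The flow above (receive a request, hand it to a worker, serve static content directly or execute dynamic content first) can be sketched in Java. This is a simulation, not a real server: the `handle` method, the paths, and the response strings are all made up for illustration; the thread pool plays the role of the server’s worker processes/threads.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class WorkerPoolSketch {
    // Hypothetical handler: dynamic files are "executed" first,
    // static files are read and sent back as-is.
    static String handle(String path) {
        if (path.endsWith(".php")) {
            return "executed " + path;   // dynamic: run the script, send its output
        }
        return "contents of " + path;    // static: read the file, send it back
    }

    public static void main(String[] args) throws Exception {
        // The pool stands in for the server's worker processes/threads.
        ExecutorService workers = Executors.newFixedThreadPool(4);
        List<String> requests = List.of("/index.html", "/report.php", "/logo.png");
        List<Future<String>> responses = requests.stream()
                .map(p -> workers.submit(() -> handle(p)))
                .toList();
        for (Future<String> r : responses) {
            System.out.println(r.get());  // each response, in request order
        }
        workers.shutdown();               // like the server closing connections
    }
}
```

Each submitted request runs on whichever worker thread is free, which is exactly the delegation the answer describes.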

Look at How Web Servers Work

EDIT:

HTTP uses TCP, which is a connection-based protocol. That is, clients establish a TCP connection while they’re communicating with the server.

Multiple clients are allowed to connect to the same destination port on the same destination machine at the same time. The server just opens up multiple simultaneous connections.

Apache (and most other HTTP servers) have a multi-processing module (MPM). This is responsible for allocating Apache threads/processes to handle connections. These processes or threads can then run in parallel on their own connection, without blocking each other. Apache’s MPM also tends to keep open “spare” threads or processes even when no connections are open, which helps speed up subsequent requests.
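The “spare threads kept open even when no connections are open” behavior maps directly onto `java.util.concurrent.ThreadPoolExecutor`. A minimal sketch, with the pool sizes chosen arbitrarily for illustration (Apache’s MPM is configured with its own directives, not this API):

```java
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class SparePoolSketch {
    public static void main(String[] args) throws Exception {
        // Core threads stay alive even when idle -- the "spare" workers.
        // The pool can grow to the maximum under load, then shrink back.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                            // corePoolSize: spares kept warm
                8,                            // maximumPoolSize under load
                30, TimeUnit.SECONDS,         // idle timeout for extra threads
                new SynchronousQueue<>());
        pool.prestartAllCoreThreads();        // spawn the spares up front
        System.out.println(pool.getPoolSize()); // 2 spare threads, no requests yet
        pool.submit(() -> "handled");         // a request reuses a warm thread
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

Pre-starting the core threads is what avoids the thread-creation cost on the first requests, which is the speed-up the answer attributes to Apache’s spare workers.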

Note:

One of the most common issues with multi-threading is the “race condition”: two requests try to do the same thing at the same time (“racing” to do it), and if a single resource is involved, only one of them can win. If they both insert a record into the database, they can’t both get the same id; one of them will win. So when writing code, you need to remember that other requests are running at the same time and may modify your database, write files, or change globals.

Answer:

The server will maintain a thread pool listening for incoming requests. Upon receiving a request, a thread will process it and return the response. If all the requests arrive at the same time and there are fewer of them than the maximum number of threads in the pool, they will all be serviced in parallel (though the actual processing will be interleaved based on the number of cores/CPUs). If there are more requests than threads, the requests will be queued (waiting for a connection) until either a thread frees up or the client request times out.
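The pool-plus-queue behavior described above can be observed directly: with a pool of 2 threads and 6 submitted requests, all 6 complete but no more than 2 ever run at once, and the rest wait in the queue. The class and counter names below are invented for the demo:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class QueueingSketch {
    static final AtomicInteger active = new AtomicInteger(); // requests running now
    static final AtomicInteger peak = new AtomicInteger();   // highest concurrency seen

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2); // at most 2 in parallel
        for (int i = 0; i < 6; i++) {
            pool.submit(() -> {
                int now = active.incrementAndGet();
                peak.accumulateAndGet(now, Math::max);  // record peak concurrency
                try { Thread.sleep(50); } catch (InterruptedException e) { }
                active.decrementAndGet();
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println("peak parallel requests = " + peak.get()); // never above 2
    }
}
```

The queued requests are invisible to the client except as added wait time, which is exactly the “waiting for a connection” case the answer mentions.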

If you’re connecting to the service over a mobile network, there is higher latency when establishing the initial connection, but usually not enough to make a noticeable difference.

Answer:

Your question is not really related to Android but to mobile development with a web backend.

I don’t know how .NET handles this on the server side, but if you take the example of an Apache/PHP/MySQL stack, each request is run in a separate thread.

There might be small latency delays while the request reaches the server, but this shouldn’t affect the time your server takes to process the request and the data.

One of the things to think about is avoiding sending multiple requests from the same client. This is a common implementation problem: since no data has come back yet, you assume there is no pending request and launch a new one. This can create unnecessary load on your server.
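One common way to guard against firing a duplicate request while one is still pending is a compare-and-set flag on the client. This is a minimal sketch with invented names (`tryStartRequest`, `onResponse`); in a real Android app the reset would also happen on failure and timeout paths:

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class RequestGuardSketch {
    // True while a request is in flight.
    static final AtomicBoolean inFlight = new AtomicBoolean(false);

    // Returns true only if this call actually started a new request.
    static boolean tryStartRequest() {
        return inFlight.compareAndSet(false, true);
    }

    static void onResponse() {
        inFlight.set(false);  // allow the next request once the response arrives
    }

    public static void main(String[] args) {
        System.out.println(tryStartRequest()); // true: request started
        System.out.println(tryStartRequest()); // false: still waiting, skipped
        onResponse();
        System.out.println(tryStartRequest()); // true: previous one finished
    }
}
```

The atomic compare-and-set matters if several parts of the UI can trigger the request concurrently; a plain boolean check-then-set would itself be a small race condition.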

Hope that helps!

Answer:

a) One instance of the web service (for example, a Spring Boot microservice) runs and listens on the server machine on a port such as 80.

b) This web service (the Spring Boot app) needs a servlet container, most commonly Tomcat. This container has a thread pool configured.

c) Whenever requests come in from different users simultaneously, the container assigns a thread from the pool to each incoming request.

d) Since the server-side code consists of beans (in the Java case) that are mostly singletons, each request thread calls the same singleton’s APIs. When database access is needed, consistency across these concurrent threads is handled through transactions, e.g. via the @Transactional annotation, which runs each request’s database operations in their own isolated transaction.
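The singleton-bean point in (d) can be illustrated without Spring: one shared service instance is called from many request threads, and it stays correct because it keeps no mutable per-request state and its one piece of shared state is updated atomically (here an `AtomicInteger` stands in for the database; `OrderService` and `placeOrder` are invented names, and the `@Transactional` behavior is only indicated in a comment):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SingletonServiceSketch {
    // One shared instance, like a Spring singleton bean: safe to call from
    // many request threads because it holds no per-request mutable state.
    static class OrderService {
        private final AtomicInteger saved = new AtomicInteger(); // stands in for the DB

        int placeOrder(String item) {
            // In Spring this method would be @Transactional: each call's
            // database work runs in its own transaction, isolated from others.
            return saved.incrementAndGet();
        }
    }

    public static void main(String[] args) throws Exception {
        OrderService service = new OrderService();     // the singleton
        ExecutorService requests = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 100; i++) {
            requests.submit(() -> service.placeOrder("book"));
        }
        requests.shutdown();
        requests.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println(service.saved.get());       // 100: no lost updates
    }
}
```

The design point is that the singleton itself does no locking: thread safety comes from keeping the bean stateless and pushing shared-state coordination down to atomic operations or database transactions.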