Last post Mar 29, 2007 02:57 AM by email@example.com
Mar 29, 2007 12:07 AM
I'm working on a web project that I anticipate (or hope) will have thousands of users at any given time. The app makes asynchronous AJAX web service calls that are long-running (at least 10 minutes per call -- I can't really divulge why the calls last so long). Everything works as coded, but I've only tested with a few simultaneous users. In theory, hundreds or thousands of users will be using my app at the same time, each holding an AJAX web service call open for a lengthy period (good thing these calls are non-blocking), meaning there could potentially be thousands of persistent connections to the server at any point in time. In practice, is this even possible, even with web farms? And how do I stress test such an app? I don't have that many users yet, but I don't want to wait until I do to find out that this won't work.
A few things to consider:
1) From what I understand, web service calls made from the client don't use up threads from the ASP.NET thread pool. (I could be wrong.)
2) The web service doesn't do a CPU-intensive calculation or a database query, and it requires no resources besides a global variable or two.
Please give me some insight into how feasible my solution is. I do have a backup plan, but it would not work quite so well.
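One way to start probing this before real users arrive is a small concurrency smoke test: open many requests at once and count how many complete. A minimal sketch in Python, with a local dummy server standing in for the real web service -- `BASE_URL`, `WORKERS`, and the 0.1 s delay are all placeholders for your actual endpoint and a 10-minute call:

```python
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    """Stand-in for the real web service: holds the connection open a while."""
    def do_GET(self):
        time.sleep(0.1)            # the real call would block for ~10 minutes
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"done")
    def log_message(self, *args):  # silence per-request console noise
        pass

# Spin up the dummy server on a free local port.
server = ThreadingHTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
BASE_URL = f"http://127.0.0.1:{server.server_address[1]}/"

WORKERS = 50  # raise this toward your expected concurrent user count

def one_call(_):
    with urllib.request.urlopen(BASE_URL) as resp:
        return resp.status

# Hold WORKERS connections open simultaneously and collect the results.
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    statuses = list(pool.map(one_call, range(WORKERS)))

print(sum(s == 200 for s in statuses), "of", WORKERS, "calls completed")
server.shutdown()
```

Pointed at the real server and scaled up, the interesting numbers are where calls start timing out or being refused -- that is the practical connection ceiling for one box.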
Mar 29, 2007 02:57 AM
There's no hard-number answer to this, and you're going to need to do some serious testing to establish the best performance for your particular situation. If I recall, the number of worker processes that can be spun up starts to drop off at about 500, but it really depends on RAM, processor capability, and how the app is designed. If you're seriously looking at high performance and availability, you're going to be looking at a web farm or clustering, and certainly load balancing.
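To get a rough feel for the "depends on RAM" part: in the one-thread-per-request model, every in-flight call pins a worker thread, so memory alone puts a ceiling on concurrent connections. A back-of-envelope sketch -- the 1 MB stack and 10 KB per-connection figures are assumptions, not measured values:

```python
# Rough capacity estimate, assuming each in-flight request pins one worker
# thread. Both constants below are guesses; measure on your own hardware.
THREAD_STACK_MB = 1          # typical per-thread stack reservation (assumption)
PER_CONN_OVERHEAD_KB = 10    # socket, buffers, request state (rough guess)

def memory_needed_mb(concurrent_users):
    """Memory consumed by holding this many connections open at once."""
    return concurrent_users * (THREAD_STACK_MB + PER_CONN_OVERHEAD_KB / 1024)

for users in (500, 2000, 10000):
    print(f"{users} users -> ~{memory_needed_mb(users):.0f} MB")
```

Under these assumptions, thousands of 10-minute calls on one machine means gigabytes of memory just to keep the connections parked, which is why farms and load balancing come up so quickly.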
Check the forums at www.iis.net for better responses.