Scott Hanselman: Cool. Cool. My second question: I'm imagining a typical web application scenario where one client hits a website. In the past, that single thread would go to an ASP.NET server, which would then call a database. Now suppose I start adding parallelism, so one person hitting my web server ends up being four connections to my database, and that becomes standard as people get familiar with these technologies. It seems like that would dramatically change the scale signature of the entire application and could get me disk-bound quicker. If I'm ultimately not going to become CPU-bound as much, I'm going to become disk-bound; people are already saying that machines are fast, CPUs are barely working, and ultimately we're IO-bound. So what does this kind of parallelism mean when it comes to IO-intensive operations?
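The fan-out Scott is describing could be sketched like this; the stubbed query methods are hypothetical stand-ins for real data access, not anything from the episode:

```csharp
using System.Threading.Tasks;

// Sketch of one request fanning out via the Task Parallel Library.
// Each stub "query" stands in for a database call that would occupy
// its own pooled connection, so one client request can hold four
// connections at once.
public static class ProductPage
{
    public static int LoadAll()
    {
        var tasks = new[]
        {
            Task.Run(() => QueryDetails()),
            Task.Run(() => QueryReviews()),
            Task.Run(() => QueryInventory()),
            Task.Run(() => QueryRelated())
        };
        Task.WaitAll(tasks);

        // Tally how many connections this single request would consume.
        int connectionsUsed = 0;
        foreach (var t in tasks) connectionsUsed += t.Result;
        return connectionsUsed; // one client request, four connections
    }

    // Hypothetical stubs; each reports the one connection it would use.
    static int QueryDetails() => 1;
    static int QueryReviews() => 1;
    static int QueryInventory() => 1;
    static int QueryRelated() => 1;
}
```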
Stephen Toub: That's really two different questions, and I'd like to address the first one, which is basically: what does this parallelism stuff have to do with the server? In general, server apps today like ASP.NET already have enough parallelism to satisfy the CPUs. Take a typical ASP.NET application getting thousands of requests per second: if each of those requests can be isolated, if it's not accessing shared state or anything like that, you're already introducing a thousand different asynchronous pieces of work. So unless you're expecting very few requests into your web server, and unless each of those few requests is doing a ton of computation work, using the Parallel class, for example, in your server application might not buy you all that much.

Which ties into your second question. Like you say, a lot of stuff is IO-bound in the server world. Take something like asynchronous pages in ASP.NET: it's basically meant to limit the number of thread pool resources you're consuming and maximize your throughput, while at the same time making sure you're not blocking other people from requesting your server resources just because you're waiting for the database to come back or for a web service call to return. So a lot of the technologies we're working on right now are focused largely at the desktop or at backend data processing, not so much at the web server world, because for the most part your frontend web applications have enough concurrency to go around.
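The idea behind asynchronous pages can be sketched with modern async/await (which postdates this episode; at the time the mechanism was ASP.NET's `Async="true"` pages and `RegisterAsyncTask`). The handler name and the `Task.Delay` standing in for a database or web service call are illustrative assumptions:

```csharp
using System.Threading.Tasks;

// Sketch of the asynchronous-page idea: while the IO is in flight,
// the thread goes back to the pool to serve other requests instead
// of sitting blocked on the wire.
public static class AsyncIoSketch
{
    public static async Task<string> HandleRequestAsync()
    {
        // Task.Delay stands in for a database or web service call.
        // The await releases the current thread until it completes;
        // no thread-pool thread is parked waiting.
        await Task.Delay(10);
        return "done";
    }
}
```

The throughput win is that a server with a fixed-size thread pool can keep accepting requests while hundreds of these awaits are pending, rather than exhausting its threads on blocked waits.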
Taken from "Parallel Programming with .NET," Hanselminutes