Relatively little attention was paid to concurrent connections. Enterprise environments tend to be well regulated, and most applications have fewer than 1,000 simultaneous users (whether human or machine-driven). As a result, application servers and related technologies evolved to support high transaction throughput at limited concurrency.
The web, on the other hand, brought much higher concurrency requirements, and platforms like WebLogic became default components of web computing environments for sites serving thousands of people at the same time. This was a breakthrough and led to significant market success in a short period.
With the rise of cloud computing, two things change. First, mobile applications and the API economy are driving an order-of-magnitude increase in the number of simultaneous users. Second, these users are often machines rather than people, and therefore aren't limited to the demand patterns of human users clicking links or refreshing their pages.
This produces a new set of demand patterns that increase both total throughput and peak concurrency. As an example, travel sites like Kayak.com and Bing.com/travel issue hundreds of API requests to airline reservation system backends as a result of a single human-driven query. Furthermore, these requests are being made not just by desktop or web applications but by mobile applications, especially iPhone applications. As most people are aware, the next 10 billion devices that come online will not be PCs or netbooks but mobile devices (phones, MIDs, GPS units, game consoles, media players). Each of these devices is prized for its native application experience, and each will be making user-driven and automated calls to cloud services in order to deliver those experiences.
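To make the fan-out effect concrete, here is a minimal sketch of how a single user query can turn into hundreds of simultaneous backend connections. The endpoint URLs, the fan-out count, and the query parameters are illustrative assumptions, not details of any particular site.

```python
import asyncio
import aiohttp

# Hypothetical backend endpoints: the hostnames and the fan-out count (200)
# are assumptions made purely for illustration.
BACKENDS = [f"https://reservations.example{i}.com/fares" for i in range(200)]

async def fetch_fares(session: aiohttp.ClientSession, url: str, query: dict) -> dict:
    """Issue one backend API request on behalf of the user's search."""
    async with session.get(url, params=query) as resp:
        return await resp.json()

async def handle_search(query: dict) -> list:
    """A single human-driven query fans out into hundreds of concurrent calls,
    so backend concurrency scales with the fan-out factor, not with the number
    of humans sitting at keyboards."""
    timeout = aiohttp.ClientTimeout(total=10)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        tasks = [fetch_fares(session, url, query) for url in BACKENDS]
        # All requests are in flight at once; return_exceptions keeps one slow
        # or failing backend from aborting the whole search.
        return await asyncio.gather(*tasks, return_exceptions=True)

if __name__ == "__main__":
    asyncio.run(handle_search({"from": "SFO", "to": "JFK", "date": "2024-06-01"}))
```

Multiply that fan-out by every concurrent search, plus automated refreshes from mobile clients, and the concurrency seen by the backends quickly dwarfs anything generated by humans clicking links.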
Where backend systems are not protected from this demand, they pay a penalty in performance and load management. The result is outright outages, "web brownouts" in which the core website that shares the same backend slows down, or erratic performance across both web and cloud properties. Again, mobile access exacerbates the issue: the intermittent nature of mobile internet connectivity multiplies the number of connections that must be set up and torn down as a device comes on and off the network.
So the explosion of concurrent usage is already beginning, and its traffic and backend impact is expanding. To manage this demand and maintain the stability of existing infrastructure, a new layer of infrastructure is emerging, much as HTTP load balancers evolved to serve the needs of web computing. What we're seeing is the rise of cloud service controllers, a category of infrastructure that works alongside existing systems and builds on the strengths of application servers, enterprise messaging systems, and application delivery controllers.