Will Cappelli

A member of the Gartner Blog Network

Will Cappelli
Research Vice President
9 years at Gartner
29 years IT industry

Will Cappelli is a Gartner Research VP in the Enterprise Management area, focusing on automation, event correlation and fault analysis, management system architectures, and real-time infrastructure issues.

Coverage Areas:

AI and IAM: Markets and Algorithms

by wcappell  |  April 26, 2012  |  2 Comments

Supply and demand tend to move towards equilibrium in markets, with prices serving as the mechanism by which that equilibrium comes about. An imbalance between supply and demand exerts downward or upward pressure on prices, and the resulting price changes, in turn, modify the amounts supplied and demanded in opposite directions until a price emerges at which supply and demand are equal. Until recently, mathematical economists focused almost exclusively on proving the existence, and mapping the structure, of market equilibria. The last few years, however, have witnessed an explosion of research into the algorithmic processes by which markets come into equilibrium (or by which game players settle on strategies, which comes to much the same thing).
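To make the idea of an equilibrium-finding algorithm concrete, here is a minimal sketch (my own illustration, not a specific AGT result) of tâtonnement-style price adjustment in Python: the price moves in proportion to excess demand until supply and demand roughly balance. The linear demand and supply curves, the starting price, and the step size are assumptions chosen purely for illustration.

    # A minimal, assumed sketch of tatonnement-style price adjustment: the price
    # moves in proportion to excess demand until supply and demand roughly balance.
    # The demand/supply curves, starting price, and step size are illustrative only.

    def demand(price):
        return max(0.0, 100.0 - 2.0 * price)   # hypothetical demand curve

    def supply(price):
        return 3.0 * price                      # hypothetical supply curve

    def find_equilibrium(price=1.0, step=0.05, tolerance=1e-6, max_rounds=10000):
        for _ in range(max_rounds):
            excess = demand(price) - supply(price)
            if abs(excess) < tolerance:
                break
            # Excess demand pushes the price up; excess supply pushes it down.
            price = max(0.0, price + step * excess)
        return price

    p = find_equilibrium()
    print("equilibrium price: %.3f (demand %.1f, supply %.1f)" % (p, demand(p), supply(p)))

With the curves above, the loop settles at a price of 20, where 60 units are demanded and 60 supplied; the point is simply that equilibrium emerges from an iterative, mechanical adjustment rule rather than from solving the equations directly.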

This study, which usually goes by the name of Algorithmic Game Theory (AGT), has already had significant commercial impact through its systematization and expansion of our understanding of optimal auction design. As AGT matures further, however, it will also profoundly impact Infrastructure and Application Management (IAM) in a world that revolves around access to cloud-based services, across mobile interfaces, by users who are embedded within social networks.

• First, it will allow enterprises to design better chargeback and internal pricing systems, both by ensuring that pricing schemes actually achieve the resource allocation and behaviour modification effects an enterprise hopes for, and by justifying the very principles on which a resource allocation goal is based.

• Second, it will support the development of automated, dynamic, decentralized resource allocation systems by working out the principles by which software agents can coordinate local actions without requiring a powerful, centralized manager of managers to ensure that scarce resources are fairly distributed across multiple business needs (a toy sketch of this kind of coordination appears after this list).

• Third, it will provide the industry with an understanding of how to extract a coherent end-to-end performance picture across multiple cloud service providers, by giving those providers a set of incentives to be open without compromising their individual interests.

• Fourth, algorithmic markets are themselves a kind of distributed computing model which could be deployed for the purposes of IAM.
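To give the second and fourth points above a little more substance, here is a toy sketch of decentralized, market-style resource allocation: each software agent repeatedly adjusts its own bid for a shared capacity pool, and the pool is split in proportion to the bids, with no central manager of managers dictating the outcome. The agent names, the valuation function, and the grid-based best-response search are hypothetical assumptions of mine, not a description of any actual IAM product or published mechanism.

    import math

    # A toy, assumed model of decentralized allocation: agents bid for a shared
    # capacity pool, the pool is split in proportion to bids, and each agent
    # repeatedly plays a best response to the others' current bids.

    CAPACITY = 100.0
    valuations = {"batch-analytics": 40.0, "web-frontend": 25.0, "backup": 10.0}

    def share(bid, others_total):
        return CAPACITY * bid / (bid + others_total) if bid > 0 else 0.0

    def utility(agent, bid, others_total):
        # Diminishing returns in capacity received, minus the cost of the bid itself.
        return valuations[agent] * math.sqrt(share(bid, others_total)) - bid

    def best_response(agent, bids):
        others_total = sum(b for a, b in bids.items() if a != agent)
        candidates = [0.5 * i for i in range(1, 201)]          # coarse grid search
        return max(candidates, key=lambda b: utility(agent, b, others_total))

    bids = {agent: 1.0 for agent in valuations}
    for _ in range(50):                                        # iterate until bids settle
        for agent in bids:
            bids[agent] = best_response(agent, bids)

    for agent in valuations:
        others_total = sum(b for a, b in bids.items() if a != agent)
        print("%s: bid %.1f, share %.1f units" % (agent, bids[agent], share(bids[agent], others_total)))

Agents with higher valuations end up bidding for, and receiving, more of the pool, and the allocation emerges entirely from local adjustments rather than from a central arbiter, which is the flavour of coordination the second point has in mind.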

It is interesting to note that one of the issues that bedevils the algorithms capable of driving markets towards equilibrium is computational complexity. The theory of computational complexity segregates algorithms into classes depending upon how the rate at which they consume resources grows with the size of their inputs. One famous class, called P (for Polynomial), contains those algorithms whose resource consumption (expressed as the time it takes the algorithm to execute) grows according to a polynomial function of input size. P algorithms are generally considered to be efficient.

Another famous class is called NP (for Non-deterministic Polynomial; think of an algorithm with branching paths, each path of which can grow with input according to a polynomial function). The NP class has many famous members, like the problem of determining an optimal tour through a network (the travelling salesman problem), and its hardest members are generally considered to indicate a high degree of inefficiency or resource consumptiveness. It turns out that the complexity or resource consumptiveness of equilibrium discovery algorithms falls somewhere between the level characteristic of P and the level characteristic of NP.
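The practical force of that distinction is easiest to see in the growth rates themselves. The sketch below (again, an illustration of my own) compares a typical polynomial workload, comparing every pair of n items, with a brute-force search over every tour of an n-node network, which must examine roughly (n-1)!/2 tours.

    import math

    # Assumed illustration of the growth-rate gap: a typical polynomial workload
    # (comparing every pair of n items, roughly n^2 steps) versus a brute-force
    # search over every tour of an n-node network (roughly (n-1)!/2 tours).

    def pairwise_comparisons(n):
        return n * (n - 1) // 2

    def brute_force_tours(n):
        return math.factorial(n - 1) // 2 if n > 2 else 1

    for n in (5, 10, 20, 40):
        print("n=%2d: polynomial ~%d steps, brute-force ~%.2e tours"
              % (n, pairwise_comparisons(n), float(brute_force_tours(n))))

At n = 40 the polynomial workload is still under a thousand steps, while the brute-force tour count is on the order of 10^46, which is why membership in P is treated as a proxy for tractability.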

So while such algorithms are not as hungry as brute-force tour optimization, they could, in theory, start consuming a lot of compute resources very quickly and without much warning, potentially undermining the second and fourth scenarios mentioned above. On the other hand, markets occur in the real world and reach equilibrium fairly rapidly, so even if the complexity here is a theoretical problem, it could be the case that in practice (or, in other words, in the neighbourhood of the input sizes we are actually likely to encounter) the chances of a resource consumption blow-up are small.


2 responses so far

  • 1 Giovanna Leopardi   April 30, 2012 at 9:09 am

    Interesting post! I was wondering if you could expand a little on your fourth point. How can markets be computers?

  • 2 wcappell   April 30, 2012 at 4:21 pm

    Thanks, Giovanna. Yes, I was a bit terse. What I meant to say is that markets, understood in terms of algorithms, are essentially machines that take an initial distribution of resources as input and compute an equilibrium position as output. Now, one could use initial resource distributions as codes for, or representations of, more general data and use the equilibrium positions arrived at as codes for, or representations of, general outputs. I have no idea how ‘computationally complete’ such a model would be. Anyone out there have an answer to THAT question?