2005-04-28

1.2.b Commoditization

In every great technological epoch, commoditization (the standardization of methods, tools, and parts to facilitate interchangeability) has served as a sister ship to the division of labour and task specialization. The power of commonalization described in the previous section drives IT infrastructure toward a Legoland of interchangeable modular parts, in both hardware and software. The components of the former (processors, memory chips, storage devices, and communication channels) are primarily assemblies of commodity items that work equally well in thousands of competing offerings. As for the latter, despite the strategy of industry leaders of building walled gardens of computing tools incompatible with those of rival firms, pools of interoperable, interworking software abound. This is what the current software trend known as web services[1] is all about: application modules unencumbered by proprietary restrictions that enable successful interaction between firms and organizations whose technologies were previously incompatible.[2]
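To make the idea concrete, here is a minimal sketch in Python (the message format and both firm-side representations are invented for illustration, and no particular web-services toolkit is assumed): two parties with incompatible internal data structures interoperate through nothing more than an agreed, vendor-neutral XML message.

    import xml.etree.ElementTree as ET

    # Firm A's internal representation: a plain dictionary.
    order_a = {"sku": "PIN-001", "quantity": 1000}

    # Serialize to a shared, vendor-neutral XML message. This wire
    # format is the only thing the two firms must agree on.
    msg = ET.Element("order")
    for key, value in order_a.items():
        ET.SubElement(msg, key).text = str(value)
    wire = ET.tostring(msg)

    # Firm B's internal representation: a class, built by parsing
    # the standard message rather than by sharing Firm A's code.
    class Order:
        def __init__(self, xml_bytes):
            root = ET.fromstring(xml_bytes)
            self.sku = root.findtext("sku")
            self.quantity = int(root.findtext("quantity"))

    order_b = Order(wire)
    print(order_b.sku, order_b.quantity)  # PIN-001 1000

Neither firm needs the other's code or tools; the shared message format alone carries the interaction.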

The traditional view of specialization among economists is that larger markets provide for the division of labour and for a narrower range of skills per worker. Yet, as Becker and Murphy point out, the costs of coordinating specialized workers who perform complementary tasks, together with the amount of general knowledge available, set limits on just how far this specialization can reach.[3]

Due to the unprecedentedly high levels of commonalization and commoditization occurring in IT, coordination costs are dramatically reduced: specialized tasks are minimized and shifted to the outer edges of a commons-based, generalized problem domain.

Of course, at some point specialized work must interwork: the labourers in Adam Smith's famous pin factory must be able to pass on the results of their specialized skills to the next worker along a serial path. In the production line, the output of one task is the designated input of the next. But the interworking of software modules, when carried out at appropriate levels of granularity, is not serial but networked: the output of one task is potentially the input of any other task, and vice versa.
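A small sketch may help here (the module names below are invented for this example). Because each module consumes and produces the same kind of value, a list of words, the output of any task can serve as the input of any other, and the same parts can be rewired without modifying them:

    def lowercase(words):
        return [w.lower() for w in words]

    def dedupe(words):
        return list(dict.fromkeys(words))  # keeps first occurrence

    def drop_short(words):
        return [w for w in words if len(w) > 2]

    words = "To be or not To be That is the question".split()

    # Serial, pin-factory style: one fixed path through the modules.
    serial = drop_short(dedupe(lowercase(words)))

    # Networked style: the same parts recombined in a different
    # order, with no rewiring of the modules themselves.
    networked = lowercase(drop_short(dedupe(words)))

    print(serial)     # ['not', 'that', 'the', 'question']
    print(networked)  # ['not', 'that', 'the', 'question']

The serial path is just one of many possible routes through the same network of parts.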

In his amazingly concise, poignant 1943 lecture "What is Life?", the physicist Erwin Schrödinger noted the importance of size. He pointed out that the molecular components of a system must be sufficiently large, or significant, to resist mutation, yet sufficiently small, or insignificant, to be replaceable. The stability of large systems, known in biology as homoeostasis, is attained when their molecular components (cells) are constantly replaced and regenerated. These dynamically reproducing "large systems" in turn play a similar role as the replaceable components of still larger systems. We can say that such molecular systems are commoditized.

In software, when we speak of the granularity of systems, we are paralleling Schrödinger's observations on stable yet evolving organic life forms. In granular systems, software objects are large enough to resist mutation, yet small enough to join together as the replaceable parts of a larger whole. As in the poetic title of David Weinberger's book, they become Small Pieces Loosely Joined.
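As a hedged illustration of such granularity (the storage contract below is hypothetical, not drawn from any real system), each piece is small enough to be replaced wholesale, while the contract it honours is stable enough that the larger whole never needs to change:

    class MemoryStore:
        """One small piece: a key-value store held in memory."""
        def __init__(self):
            self._data = {}
        def put(self, key, value):
            self._data[key] = value
        def get(self, key):
            return self._data.get(key)

    class LoggingStore:
        """A drop-in replacement honouring the same small contract."""
        def __init__(self):
            self._data = {}
        def put(self, key, value):
            print(f"put {key!r}")
            self._data[key] = value
        def get(self, key):
            return self._data.get(key)

    def run(store):
        # The larger system is written against the contract,
        # not against any one replaceable part.
        store.put("author", "Schrödinger")
        return store.get("author")

    print(run(MemoryStore()))   # one small piece ...
    print(run(LoggingStore()))  # ... loosely replaced by another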

[1] See http://www.w3.org/2002/ws/

[2] It should also be pointed out that consumers increasingly own and operate their own technology when interacting with firms. Technological interoperability is no longer merely a firm-to-firm consideration.

[3] Gary S. Becker and Kevin M. Murphy, "The Division of Labor, Coordination Costs, and Knowledge," Quarterly Journal of Economics, Vol. 107, No. 4 (Nov. 1992), pp. 1137-1160.





