Is backward compatibility really necessary? This is again a question of choice and trade-offs. From a rational economic perspective, if the value you expect to create by giving up backward compatibility exceeds the value you would retain by keeping it, then go ahead and drop backward compatibility like a hot potato.
Now, I'm going to digress a little and take a moment to talk about value. What is value? If a customer is willing to pay an extra dollar for a feature, then the feature has value. Value, therefore, is in the mind of the customer and is somewhat related to the concept of utility in economics.
Backward compatibility is a big deal. It would be a pain in the neck if operating systems did not have backward compatibility. Testing and reconfiguring hundreds of applications would kill you without backward compatibility at the OS level. If hardware did not have backward compatibility, the upgrade path would be strewn with problems. This is the reason most vendors follow a product life-cycle approach: they describe an upgrade path through the various versions of a product and then finally sunset it. Sunsetting means that the next version of the product won't be backward compatible, and you will have to go through the painful process of reconfiguration and testing. The key is that this pain should result in long-term value.

This is what is promised when you go from the Alpha architecture to the Itanium architecture, from 32-bit to 64-bit, from Windows to Linux or Apple's Mac OS X. This is also called burning the bridge. You make a bold statement that you are not going back. You are going to start a new life. You are going to replace all your applications, along with the problems that came with them, and configure new software around new business processes. This is very expensive. For a $50 million organization the cost could be $1 million; for a $500 million organization the cost won't be $10 million but more like $15 million; and for a $5 billion organization the cost would be in the range of $250 million. The problem is that the expense grows faster than the size of the organization.
Why do we have this accelerated increase in expense as the size of the organization grows? The problem is related to the cost of coordination, synchronization, business-process optimization, and testing. These are diseconomies of scale: as you grow, you have to spend more. Clearly, our large organizations have many offsetting economies of scale, otherwise they would not be able to sustain themselves. In fact, this is not true in all cases. I've seen several large organizations teetering on the brink of bankruptcy due to problems in upgrading their core systems.
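To make that concrete, here is a minimal sketch that fits a simple power law, cost ≈ a · revenue^b, to the three dollar figures quoted above. The power-law form is an assumption for illustration only, not a validated cost model; the point is just that the fitted exponent lands above 1, which is exactly the "expense grows faster than the organization" effect.

# A rough sketch: fit cost = a * revenue^b to the three figures quoted above.
# Assumption: the power-law form is illustrative only, not a real cost model.
import math

# (annual revenue, migration cost) in millions of dollars, from the text
data = [(50, 1), (500, 15), (5_000, 250)]

# Ordinary least squares on the logs gives the exponent b and prefactor a
xs = [math.log(r) for r, _ in data]
ys = [math.log(c) for _, c in data]
n = len(data)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2
)
a = math.exp((sum(ys) - b * sum(xs)) / n)

for revenue, cost in data:
    fit = a * revenue ** b
    print(f"${revenue}M organization: quoted ${cost}M, "
          f"fit ${fit:.0f}M ({100 * cost / revenue:.0f}% of revenue)")

print(f"fitted exponent b = {b:.2f}  (b > 1: cost grows faster than revenue)")

With these particular numbers the exponent comes out around 1.2, which means a tenfold larger organization pays roughly fifteen times more, matching the jump from $1 million to $15 million in the figures above.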
From a purely software point of view, backward compatibility issues limit designers in pursuing the best course of action. So, should you burn the bridges and the boats or not? It is as much a question of philosophy as of the economics of risk and reward.