When I started programming in the 1960s, application development was predominantly an engineering discipline practiced by a handful of math/science-oriented “professionals”. Most of the early applications we built were foundational to the business and, as such, required reliability as a major component. Estimation of effort, time and cost was more an art than a science for most projects, and the business user was willing to live with those constraints as long as the resulting solution reasonably met his or her most important needs. Interestingly, in most organizations I was part of, requirements were generally gathered in an informal, ad hoc manner, and projects loosely followed a design-code-test-implement sequence rather than any formal AD methodology.
Jump forward about 15 years and we were still pretty much in the same boat in terms of engineering new (or enhancing existing) foundational applications. But, due to all the project failures, we began to see most organizations implement more guidelines for AD processes and methods. Since reliability was still the predominant issue, “waterfall AD methodologies” like Yourdon, Information Engineering, etc., were more commonly used to ensure a more consistent result. But, of course, the additional architectural and analysis steps also tended to make solution delivery somewhat slower than under the previous “design and code” approaches. Some of this was overcome through the use of complementary enterprise 4GLs for simpler or more ad hoc development needs.
Fast forward to today’s environment with more widely IT-literate application developers and users. Time is money and money is limited. Business and application agility generally trumps reliability. And, making things worse, all those foundational applications are now interfaced/integrated in dozens of ways, making change management problematic. We have moved from engineering to managing a “solution architecture ecosystem”, where application development has to deal with the biological impact of change on those foundational applications, along with those we have bought, those that are part of the supply chain, and those that increasingly exist “in the cloud”.
We have new “agile AD methods” which are being used to extend those legacy and purchased applications and data through new browser/portal technologies, using concepts like mashups and compositions. But not everyone is succeeding with agile methods on all their projects. We are seeing “basic agile methods” used enterprise-wide on “do it quick” projects, many of which can be highly innovative and provide important business value. However, we are also seeing many failures on agile projects where iterations of delivery (i.e., improving functionality and reliability over time) are insufficient to meet “enterprise-class” needs for quality, coordination and integration. In some cases, organizations have erroneously backed off the use of agile methods completely, even where they would be the most appropriate choice.
What’s needed is a segmentation of project methods into at least three categories: 1) basic agile AD methods for more self-contained, simpler applications; 2) waterfall AD methods for highly interdependent and complex applications; and 3) an emerging hybrid of the two, which we at Gartner refer to as “enterprise-class agile AD”, for projects whose needs require a greater balance of “do it quick and get it right”.
Best Practice: Make multiple “flavors” of AD methodologies available to your projects and select the one that best matches your needs based on a variety of factors, including the requirement for speed versus reliability. Agile methods ought to be part of that portfolio of methodological options.
Note: Gartner customers can read more about this topic and the factors to consider in “Enterprise-Class Agile Development Defined”, “Enterprise-Class Agile Development as Part of the Enterprise Solution Architecture” and “Selecting the Best Project Approach to Application Development and Composition”.