Programming productivity
Programming productivity refers to a variety of software development issues and methodologies affecting the quantity and quality of code produced by an individual or team. Key topics in productivity discussions have included:
- Amount of code that can be created or maintained per programmer, often measured in source lines of code (SLOC) per day (a minimal sketch of such a measure follows this list)
- Detecting and avoiding errors (through techniques such as Six Sigma management, zero-defects coding, and Total Quality Management)
- Software cost estimation (cost being a direct consequence of productivity)
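As a rough illustration of SLOC-based measurement, the sketch below counts non-blank, non-comment lines in a set of source files and divides by person-days. The file arguments, the 12 person-day figure, and the Python-style "#" comment convention are all assumptions made for the example; SLOC per day is, at best, a crude proxy for productivity.

```python
import sys
from pathlib import Path

def count_sloc(path: Path) -> int:
    """Count non-blank lines that are not pure '#' comments (a naive SLOC measure)."""
    return sum(
        1
        for line in path.read_text().splitlines()
        if line.strip() and not line.strip().startswith("#")
    )

# Hypothetical usage: pass source files on the command line, then divide
# the total by the person-days spent writing them (12 here, also assumed).
person_days = 12
total = sum(count_sloc(Path(name)) for name in sys.argv[1:])
print(f"{total} SLOC / {person_days} person-days "
      f"= {total / person_days:.1f} SLOC per day")
```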
The relative importance of programming productivity has waxed and waned along with other industry factors, such as:
- The relative cost of manpower versus machine time
- The size and complexity of the systems being built
- Highly publicized projects that suffered from delays or quality problems
- Development of new technologies and methods intended to address productivity issues
- Quality management techniques and standards
An extensive literature exists dealing with such issues as software productivity measurement, defect avoidance and removal, and software cost estimation. The heyday of such work was during the 1960s–1980s, when huge mainframe development projects often ran badly behind schedule and over budget. A potpourri of development methodologies and software development tools was promulgated, often championed by independent consultants brought in as troubleshooters on critical projects. The U.S. Department of Defense was responsible for much research and development in this area, as software productivity directly affected large military procurements.
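As one concrete example of the cost-estimation models from that period, the sketch below applies the Basic COCOMO effort and schedule equations from Boehm's constructive cost model. The coefficient defaults are the published values for an "organic" (small, familiar, in-house) project class; the 50 KLOC project size is a hypothetical input chosen for illustration.

```python
def basic_cocomo(kloc: float, a: float = 2.4, b: float = 1.05,
                 c: float = 2.5, d: float = 0.38) -> tuple[float, float]:
    """Basic COCOMO: effort = a * KLOC**b (person-months),
    schedule = c * effort**d (calendar months).
    Defaults are the published coefficients for an 'organic' project."""
    effort = a * kloc ** b
    schedule = c * effort ** d
    return effort, schedule

# Hypothetical 50 KLOC project:
effort, schedule = basic_cocomo(50.0)
print(f"Effort:   {effort:.0f} person-months")  # ~146 person-months
print(f"Schedule: {schedule:.1f} months")       # ~16.6 months
```

Under these assumed inputs the model predicts roughly 146 person-months of effort over about 16.6 calendar months, which illustrates how directly productivity assumptions fed into the budgets of large procurements.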
In those days, large development projects were generally clean-sheet implementations of entire systems, often including their own system-level components (such as data management engines and terminal control systems). As a result, large organizations had enormous data processing staffs, with hundreds or thousands of programmers working in assembly language, COBOL, JOVIAL, Ada, or other tools of the day.
Modern computer use relies much more heavily on standardized platforms and products, such as the many general-purpose tools available today under Linux and the Microsoft operating systems. Organizations have more off-the-shelf solutions available, and computer use is a basic job requirement for most professionals. Tasks that once would have required a small development team are now tackled by a college intern using Microsoft Excel. The result has been a trend toward smaller IT staffs and smaller development projects. For larger projects, techniques like rapid prototyping have shortened development timelines, placing a priority on quick results with iterative refinement. Traditional programming-in-the-large has thus become rare – the domain of industry giants like Microsoft and IBM. As a result, although programming productivity is still considered important, it is viewed more along the lines of engineering best practices and general quality management than as a distinct discipline.
A need for greater programmer productivity was the impetus for categorical shifts in programming paradigms. Factors driving these shifts have included:
- Speed of code generation
- Approach to maintenance
- Emerging technologies
- Learning curve (training required)
- Approach to testing