Browse Documents (abstracts & extracts)

The success of technology-based projects stems from the availability of applicable technologies; the skills, knowledge, and attitudes of the people performing the project; and how those people go about performing the project (the processes they use). Three key capabilities required by organizations for the successful performance of most technology-based projects are project management, systems engineering, and software engineering.

The process maturity (process quality) of each of the above processes within an organization may be measured. Each measurement may be based on a Capability Maturity Model (CMM) for the process. The CMM serves as a process reference model, describing processes that correspond to differing levels of process capability, and is used in association with an assessment method.
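To make the idea of staged maturity levels concrete, the sketch below uses the level names from the staged CMMI representation. The rating function is a deliberate simplification for illustration only, not an actual appraisal (e.g. SCAMPI) algorithm: it assumes an organization's level is the highest level for which the goals of that level and all lower levels are satisfied.

```python
# Level names follow the staged CMMI representation; the rating logic
# is a hypothetical simplification, not a real appraisal method.
MATURITY_LEVELS = {
    1: "Initial",
    2: "Managed",
    3: "Defined",
    4: "Quantitatively Managed",
    5: "Optimizing",
}

def rate_maturity(satisfied_levels):
    """Return (level, name) for the highest level whose goals, and the
    goals of every level below it, are satisfied. Level 1 is the floor
    and requires no goals."""
    level = 1
    while (level + 1) in MATURITY_LEVELS and (level + 1) in satisfied_levels:
        level += 1
    return level, MATURITY_LEVELS[level]

# Example: an organization satisfying the goals of levels 2 and 3
print(rate_maturity({2, 3}))  # (3, 'Defined')
```

The cumulative check reflects the staged model's core rule: an organization cannot claim level 4 while level 2 or 3 goals remain unsatisfied.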

CMMI Briefing

Capability Maturity Model Integration (CMMI): A Path to Improved Systems

Joe Jarzombek, PMP
Deputy Director for Software Intensive Systems
Acquisition Resources and Analysis Directorate
Office of the Under Secretary of Defense (AT&L)
Building upon Standards To Guide Process Improvement

CMMI Tutorial

CMMI® Tutorial

INCOSE-WMA, Oct 2002

The following specification is for the CMMI Product Suite. The specification defines the scope, lists applicable documents, defines the requirements the CMMI Product Suite must meet to be considered acceptable, identifies the methods for verifying achievement of those requirements, and provides packaging information and general notes.


EIA, INCOSE, and EPIC Announce the Release of EIA/IS-731, Systems Engineering Capability Model

A cooperative effort of three organizations, intent on the improvement of the systems engineering discipline, has resulted in the issuance of a new Interim Standard for defining, improving, and assessing systems engineering capability. The three organizations involved in developing the new standard are the Government Electronics and Information Technology Association (GEIA) of the Electronic Industries Alliance (EIA), the International Council on Systems Engineering (INCOSE), and the Enterprise Process Improvement Collaboration (EPIC).

EIA-731 Systems Engineering Capability Maturity Model Standard

This document was developed by a working group with representation from the following organizations :

- Electronic Industries Alliance (EIA)

- International Council on Systems Engineering (INCOSE)

- Enterprise Process Improvement Collaboration (EPIC).

- Designed to help organizations improve their practice of systems engineering through self-assessment

- Compatible with the EIA-632 and IEEE 1220 standards on systems engineering

- Provides a reference for comparing actual systems engineering practices against essential elements

- Encompasses all phases of the system life cycle and focuses on process characteristics

- Available as a free download; a hard copy may be purchased

A Standardized Approach to IT Performance

Government and industry need to assess the maturity of their internal software acquisition processes. The purpose of assessing the maturity of an organization's software acquisition processes is to identify areas needing improvement. To make improvements, organizations must know the ultimate goal and what is required to achieve it; progress toward that goal must also be measurable. A capability maturity model provides the framework needed to facilitate the desired improvement. The Software Acquisition Capability Maturity Model (SA-CMM) has been developed to provide such a framework.


This work is the product of a collaborative effort by various organizations within government, industry, and academia. This document includes many excerpts from “A Systems Engineering Capability Maturity Model, Version 1.1,” CMU/SEI-95-MM-003, published in November 1995.

CMM Practices

This document was produced as part of the revision of CMM Version 1.0, which was released in August 1991.

This Manual is issued under the authority of DoD Directive 5000.59, DoD Modeling and Simulation (M&S) Management, January 4, 1994. Its purpose is to prescribe a uniform glossary of modeling and simulation (M&S) terminology for use throughout the Department of Defense. In addition to the main glossary of terms, this manual includes a list of M&S related abbreviations, acronyms, and initials commonly used within DoD.



NOTE: The following acronyms and abbreviations are used by system acquisition managers within the Department of Defense (DoD). The majority of those dealing primarily with the management of the acquisition process are defined in Appendix B, Glossary of Terms. Those that refer to Service unique titles and organizations are not further defined.



Subject: [Iol-news] What's New on INFORMS Online, February 3, 2003
Date: 4/2/03 4:51 PM
Received: 4/2/03 7:22 PM
From: Brian Borchers,

What's New on INFORMS Online, February 3, 2003:

Read Pate-Cornell and Fischbeck, "Risk Management for the Tiles of the
Space Shuttle," Interfaces 24(1), Jan-Feb 1994, 64-86.

OR/MS Tomorrow (student newsletter) Student Paper Competition deadline
extended to Feb 15, 2003. $100 prize for the winner, to be published
in OR/MS Tomorrow.

Visit the INFORMS 50th Anniversary Store.

Bid for your autographed copy of the 50th Anniversary Issue of
Operations Research on eBay. Bidding closes 1:00am EST Feb 7.

Conference pages are up for the May, 2003 Practice Conference.

Conference pages are up for EURO/INFORMS, July, 2003.

Conference pages are up for the Atlanta Annual Meeting, October, 2003.

There are numerous updates to the prizes page, including calls for
nominations and winners.

Iol-news mailing list

The SRE mailing list aims to act as an online forum for exchange of ideas among the requirements engineering researchers and practitioners.

This moderated list is a free service which is offered by the CSIRO-Macquarie University Joint Research Centre for Advanced Systems Engineering (JRCASE) at Macquarie University, Sydney. The list manager and moderator is Dr Didar Zowghi.

Lessons Learned as Compiled by Jerry Madden, Associate Director of the Flight Projects Directorate at NASA's Goddard Space Flight Center: (Jerry collected these gems of wisdom over a number of years from various unidentifiable sources. They have been edited by Rod Stewart of Mobile Data Services in Huntsville, Alabama.) January 1, 1995. Updated July 9, 1996.



A. The Guide: What It Is and Is Not

This guide was developed for the Naval Air Systems Team in recognition of the need to:

Provide a single consolidated overview of the major internal NAVAIR Team acquisition processes.

Provide a quick, ready reference identifying the major reviews, approval, and documentation requirements associated with the acquisition process.

Provide helpful advice from our "corporate memory" to Program Managers (PMs), their Integrated Program Teams (IPTs), and team members who are new to the process.

Provide a list of key acquisition experts and process managers to assist the PMs/IPTs through the acquisition process.

The following points represent what this guide is not intended to do:

It does not supersede existing Notices, Instructions, Directives or established DoD/DoN/NAVAIR Team policy on the acquisition process.

It does not describe every activity and/or document required in managing a program within the NAVAIR Team.

It is not a "cookbook" approach to our acquisition process. The uniqueness of each acquisition program precludes such an approach.

B. The Guide: Its Purpose

The systems acquisition and life cycle management process for the development, production, and support of weapons/systems to satisfy the needs of the Fleet is extremely complex and lengthy. There are numerous interrelated Department of Defense and Navy directives and implementing instructions detailing each part of the process.

The purpose of this NAVAIR Team Acquisition Guide is to "pull together" the activities and critical documentation required and put these requirements in a concise, maintainable, and easy-to-use format to help our PMs/IPTs plan ahead. PMs, IPT leaders, and their attendant team members, particularly new members, must know the process, the sequence of events, and the average time cycle to complete events in order to plan their programs and ensure timely obligation and expenditure of budgeted funds. In addition, by seeing the entire process, our NAVAIR leadership can focus on better ways to manage it: establishing time limits for each part of the acquisition process, minimizing the number of process events, and monitoring system performance against the established process standards.


In the years following World War II, the United States entered a period of technological competition with the then Soviet Union called the Cold War. It was a classic quality versus quantity confrontation. The Soviets designed and built tough, technically simple, iterative systems that could be produced in large numbers. The United States usually chose the latest technological solution and relied on projected higher “kill ratios” to prevail in combat even if the confrontations were between Soviet and U.S. Third-World clients. By the middle of the 1960s, a terrible truth was obvious about the U.S. commitment to high technology. Our systems were fragile, expensive to support, and short-lived when employed. The F-111 aircraft was the classic example. Brilliant in concept, it was formidable on the rare occasion when everything worked and lasted for the duration of a mission. The amount of equipment and number of personnel required to support that aircraft and the support costs involved were shocking. A new philosophical approach was definitely required.

The philosophy was simple to state: Influence the design of a system from its conception so that support was considered and life-cycle costs minimized. The implementation was more difficult. The iterative nature of the design and manufacturing process created disciplinary “stovepipes” that resisted the intrusion of support considerations on design, and the logisticians lacked an effective tool-set to credibly present their arguments. Intuition wasn’t good enough.

The original operational test orientation of this research is noted in the Background paragraph of Chapter 1. To accomplish the original objectives it was necessary to acquire considerable data on the cost, schedule, and performance success of programs within the Engineering and Manufacturing Development (EMD) phase of development. The current research uses the original methodology and expands the results by including more recent programs and additional parameters. Hence this report contains program cost, schedule, and performance results by Service and within three separate year groupings. It also contains a comparison between the Services’ Operational Test Activity (OTA) test reports and the Director, Operational Test and Evaluation (DOT&E) independent evaluations of those reports.


This publication presents the results of an intensive 11-month program for three military research fellows. The Under Secretary of Defense (Acquisition and Technology) (USD(A&T)) chartered this fellowship program in 1987. The program brings together selected officers from the Air Force, Army, and Navy for two primary purposes: first, to provide advanced professional and military education for the participating officers; and, second, to explore new and innovative concepts that will enhance the Department of Defense acquisition community.

The fellowship program, managed by the Defense Systems Management College (DSMC), is conducted in three phases. In the first phase, the three officers meet at DSMC for four weeks to begin to determine their research goals, define a research plan, initiate background research, and consult with the DSMC faculty. In the second phase, the fellows attend the Program for Management Development at Harvard Business School. This comprehensive 11-week executive education program brings together functional-level executives and new general managers from as many as 39 countries to learn the state-of-the-art management techniques and technologies necessary to become successful general managers in today’s global marketplace. In the third phase, the fellows return to DSMC to conduct their joint research, culminating in the publication of their research report.

This report identifies a path for the leadership of the Department of Defense Acquisition System to follow for implementing successful acquisition reform. It is intended to serve as a primer for changing organizations, and includes lessons learned from the perspective of implementing change. The report presents a model for change based on academic understanding of and industry practices for organizational change. In developing the model, we looked at the latest Department of Defense acquisition reform effort, and addressed what the Department of Defense can do to improve the change process. We analyzed how organizations, within both the military and industry, have successfully led change and determined what could be learned from those organizations. The model is designed to assist program managers and senior leadership in implementing change in Department of Defense organizations.