By Sebastian Sprenger
August 14, 2013
Defense Department officials should take greater care to ensure their programs are tested under preplanned, realistic conditions, skipping crude data analysis techniques in favor of more advanced ones that yield better insight into system performance, according to a report prepared for Pentagon chief weapons tester J. Michael Gilmore.
The document, submitted last month by Catherine Warner, Gilmore’s science adviser, notes “some cases” in which tests deviated from conditions described in test plans. “This has the effect of reducing the efficiency or limiting the conclusions that can be made from the subsequent data,” it states.
Warner’s report praises the services’ test agencies for improving experimental test designs in recent years, but urges them to employ advanced statistical methods to analyze results. “The simplest case of this is where a test is designed to cover all or many of the important operations conditions, and is optimized to be extremely efficient in the number of iterations in each condition, but the data analysis is limited to reporting a single average (mean) of the performance across all the test conditions,” the report adds.
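The report's point can be illustrated in general terms. The data, condition names, and numbers below are hypothetical and not drawn from the report; this is only a sketch of why a single overall average can mislead when performance varies by test condition:

```python
from statistics import mean, stdev

# Hypothetical detection-rate samples for three test conditions.
# (Illustrative values only -- not from the report.)
samples = {
    "day":     [0.92, 0.95, 0.90, 0.93],
    "weather": [0.85, 0.88, 0.86, 0.84],
    "night":   [0.55, 0.60, 0.52, 0.58],
}

# The "single average" the report warns against: one number
# pooled across every condition.
all_values = [v for vals in samples.values() for v in vals]
grand_mean = mean(all_values)

# A condition-by-condition summary: mean and spread per condition,
# which exposes the weak "night" performance the grand mean hides.
per_condition = {
    cond: (mean(vals), stdev(vals)) for cond, vals in samples.items()
}

print(f"grand mean: {grand_mean:.3f}")
for cond, (m, s) in per_condition.items():
    print(f"{cond:8s} mean={m:.3f} sd={s:.3f}")
```

In this toy example the pooled average looks respectable while one condition performs far worse than the others, which is exactly the kind of insight lost when "the data analysis is limited to reporting a single average."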
A “rigorous analysis” using advanced statistical methods would lead to greater confidence about system performance levels. That information, in turn, is key to making acquisition decisions, according to Warner’s report.
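One way such an analysis quantifies confidence is with an interval estimate rather than a bare average. The sketch below uses hypothetical numbers and a normal approximation from the Python standard library; a real operational-test analysis with this few runs would typically use a t-distribution or a more sophisticated model:

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

# Hypothetical samples from a single test condition (illustrative only).
hits = [0.55, 0.60, 0.52, 0.58]

n = len(hits)
m = mean(hits)
se = stdev(hits) / sqrt(n)  # standard error of the mean

# 95% interval under a normal approximation; z is the 97.5th
# percentile of the standard normal distribution (about 1.96).
z = NormalDist().inv_cdf(0.975)
ci = (m - z * se, m + z * se)

print(f"mean={m:.3f}, 95% CI=({ci[0]:.3f}, {ci[1]:.3f})")
```

Reporting an interval like this, per condition, is one concrete form the "greater confidence about system performance levels" mentioned in the report can take.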