
Army Sniper Munition Optimization: Comprehensive Dispersion Response Analysis
Explore a study on optimizing the dispersion performance of the M118LR Sniper cartridge for enhanced accuracy in military operations. The research involved detailed statistical analysis and a custom split-plot design to meet the demands of new rifle specifications and improve target accuracy downrange.
Presentation Transcript
UNCLASSIFIED
JMP Discovery Summit, October 19, 2017
Army Sniper Munition Optimization: A Comprehensive Dispersion Response Analysis Using a Custom Split-Plot Design, Loglinear Variance, and Graphical Analyses
Christopher Drake, Lead Statistician, Small Caliber Munitions, QE&SA Statistical Methods & Analysis Group
Douglas M. Ray, Lead Statistician, ARDEC, QE&SA Statistical Methods & Analysis Group
U.S. Army Armament Research, Development & Engineering Center: Unparalleled Commitment & Solutions
DISTRIBUTION A. Approved for Public Release; Distribution is unlimited.
BACKGROUND AND MOTIVATION
The M118LR Sniper cartridge is a 7.62mm round used by military snipers for its enhanced dispersion performance. The push to improve dispersion further was largely driven by the stricter requirements of a new compact sniper rifle. Using Monte Carlo simulation and prior data, it was estimated that the current M118LR cartridge, fired from the new rifle, would fail the rifle's Lot Acceptance Testing (LAT) specification requirements a larger-than-desired percentage of the time. This would lead to an unacceptable risk of failing lots of new weapons, which costs money and time and temporarily leaves the Warfighter with fewer fielded compact sniper weapon systems. By improving the dispersion performance of the M118LR to meet the demands of the new compact sniper system, the Army would also be improving dispersion performance across the board, decreasing the risk of missing targets downrange.
TEST PLANNING
To better scope and understand the work needed to meet the test objective, a lengthy planning phase took place before the Dispersion Screening DOE. Some of the tools used include:
- Factor Brainstorming
- Input and Output Diagrams
- Fishbone (Ishikawa) Diagrams
- Failure Mode and Effect Analysis (FMEA)
- Variation Mode and Effect Analysis (VMEA)
- Interaction Assessments
- Stakeholder Analyses
- Voice of Customer input
- Measurement Systems Analyses
This crucial activity played a large role in the successful execution of the DOE and was leveraged in many ways during the test design.
[Figure: interaction assessment (N-squared) matrix for factors X1 through X13, with a row sum indicating each factor's total interaction score]
DESIGN METHODOLOGY
The factors for the Dispersion Screening DOE were somewhat complex: various mixed-level factors, a hard-to-change factor, and 13 factors in total (relatively large). With classical screening designs, accommodating mixed-level and hard-to-change factors would not be possible. The Optimal Design is the best type of design for this kind of problem, as it shines when handling these challenges and allows for a highly customizable environment that conforms to the test limitations and objectives. Optimal designs are modern, computer-generated designs in which the test points are placed according to a selected optimality criterion and the specified model. Optimal Designs have seen growing favorability in recent years due to their flexibility and comprehensive nature, coupled with readily available computing power.
[Figure: classical design examples (general factorial 3x3x2, 2-level factorial 2^3, fractional factorial 2^(3-1), response surface central composite) contrasted with an IV-optimal design over a constrained test region containing a disallowed region]
DESIGN METHODOLOGY CONT.
The optimality criterion chosen for the dispersion optimization test was D-Optimality, a common criterion used in screening designs. This criterion aims to maximize the determinant of the information matrix (X'X) of the design. For the dispersion screening test, a split-plot design was necessary due to the presence of a hard-to-change factor. In this case, we seek to maximize D as defined in the equation below, where X is the model matrix and V is the block-diagonal covariance matrix of the responses (Goos and Vandebroek, 2001):
D = det(X' V^-1 X)
Two other popular criteria are I-Optimality (used for response surface and higher-order designs) and Alias Optimality (an alternative to D-Optimality), which seeks to minimize the confounding and aliasing of model terms.
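As a rough illustration of the criterion above, the Python sketch below evaluates det(X' V^-1 X) for a tiny split-plot design, assuming a compound-symmetric covariance structure (a shared whole-plot variance added within each block). The factor layout, variance values, and function name are hypothetical and not from the presentation.

```python
import numpy as np

def split_plot_d_criterion(X, whole_plot_ids, var_wp=1.0, var_resid=1.0):
    """D-criterion det(X' V^-1 X) for a split-plot design, where V is block
    diagonal: residual variance on the diagonal plus a shared whole-plot
    variance for runs in the same whole plot (compound symmetry)."""
    n = X.shape[0]
    V = var_resid * np.eye(n)
    for wp in np.unique(whole_plot_ids):
        idx = np.where(whole_plot_ids == wp)[0]
        V[np.ix_(idx, idx)] += var_wp  # add the whole-plot variance within each block
    return np.linalg.det(X.T @ np.linalg.inv(V) @ X)

# Toy 6-run design in 3 whole plots: intercept, hard-to-change factor A, easy factor B
A = np.array([-1, -1, 0, 0, 1, 1])        # constant within each whole plot
B = np.array([-1, 1, -1, 1, -1, 1])       # varies within whole plots
X = np.column_stack([np.ones(6), A, B])   # main-effects model matrix
wp = np.array([0, 0, 1, 1, 2, 2])
print(split_plot_d_criterion(X, wp, var_wp=0.5, var_resid=1.0))
```

A design optimizer would repeat this evaluation over many candidate point placements and keep the design with the largest criterion value.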
DISPERSION CONSIDERATIONS
Dispersion responses are inherently noisier and more difficult to estimate due to their highly variable nature, and each data point requires a grouping of test events to characterize the response. There are many different dispersion metrics used in practice to characterize spread, some of which are more information-rich than others. Widely used dispersion metrics include Extreme Spread, Mean Radius, Radial Standard Deviation, and Vertical and Horizontal Standard Deviation. With the raw data points (X and Y coordinates) from each shot, one can calculate any of these metrics, so gathering this raw data preserves all of the information that could possibly be pulled from the data. The two metrics used most frequently in the analysis phase of this test were Mean Radius (summarized REML model) and Vertical and Horizontal Standard Deviation (Loglinear Variance model), metrics known for their information-rich qualities.
[Figures: mean radius formula, vertical and horizontal standard deviation formulas, mean radius visualized, bivariate normal distribution]
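To show how these metrics derive from the raw shot coordinates, here is a minimal Python sketch (not from the presentation). The function name is made up, and note that "radial standard deviation" has more than one definition in practice; the version below is one common convention.

```python
import numpy as np
from itertools import combinations

def dispersion_metrics(x, y):
    """Common dispersion metrics for one shot group, from raw (x, y) impact coordinates."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cx, cy = x.mean(), y.mean()                       # group center
    r = np.hypot(x - cx, y - cy)                      # radial miss distances from center
    extreme_spread = max(np.hypot(x[i] - x[j], y[i] - y[j])
                         for i, j in combinations(range(len(x)), 2))
    return {
        "extreme_spread": extreme_spread,             # largest shot-to-shot distance
        "mean_radius": r.mean(),                      # average radial distance from group center
        "radial_sd": np.sqrt(x.var(ddof=1) + y.var(ddof=1)),  # one common definition
        "horizontal_sd": x.std(ddof=1),
        "vertical_sd": y.std(ddof=1),
    }

# Example: one simulated 15-shot group drawn from a bivariate normal
rng = np.random.default_rng(7)
shots_x, shots_y = rng.normal(0, 1.0, 15), rng.normal(0, 1.2, 15)
print(dispersion_metrics(shots_x, shots_y))
```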
DISPERSION CONSIDERATIONS CONT.
With regard to sample size per grouping for summarized data, one can consider relative precision versus group size, with larger group sizes giving more precise estimates. Although 20-round groups were desired initially, test sample limitations required smaller group sizes for the Dispersion DOE. It was determined that 15 samples per group were adequate for dispersion responses, based on work from the Grubbs pamphlet (Grubbs, 1964). There are diminishing returns from increasing the shot group size of a dispersion response, with the benefit of shooting more than 15 rounds being minimal, as can be seen in the figure below.
[Figure: relative precision versus shots per group]
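To make the diminishing-returns point concrete, the sketch below is a quick Monte Carlo illustration, assuming a circular normal impact distribution (this is not Grubbs' actual calculation or tables). It summarizes precision as the coefficient of variation of the estimated mean radius across many simulated groups.

```python
import numpy as np

# Relative precision of the mean-radius estimate versus shots per group,
# for a circular normal impact distribution (illustrative only).
rng = np.random.default_rng(42)
n_groups = 20_000
for shots in (5, 10, 15, 20, 30):
    x = rng.normal(0, 1, (n_groups, shots))
    y = rng.normal(0, 1, (n_groups, shots))
    r = np.hypot(x - x.mean(axis=1, keepdims=True),
                 y - y.mean(axis=1, keepdims=True))
    mean_radius = r.mean(axis=1)                      # one estimate per simulated group
    cv = mean_radius.std() / mean_radius.mean()       # smaller CV = more precise estimate
    print(f"{shots:>2} shots/group: CV of mean radius ~ {cv:.3f}")
```

The improvement from 15 to 20 or 30 shots is visibly smaller than the improvement from 5 to 15, which is the trade-off the slide describes.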
TEST CONSTRAINTS AND SPLIT PLOT DESIGNS
Due to logistical limitations, the barrel/suppressor factor was of particular concern because of its hard-to-change nature. When a factor is hard to change, it restricts our ability to fully randomize the test, leaving us unable to adhere to the fundamental test design principle of randomization. For this situation, Split Plot designs are utilized, which balance the number of changes to the hard-to-change factor with some randomization via the use of Whole Plots. These Whole Plots are groupings of the hard-to-change factor settings, of a size that can be controlled largely based on the difficulty of factor changes. The groupings, which lessen the total number of changes required, are then randomized to mitigate nuisance variable confounding concerns. The smaller the whole plot sizes, the closer one approaches full test randomization, improving the overall statistical power and alias structure of the design. The trade-off is that Split Plot designs are generally more complicated to analyze and less efficient than designs without randomization constraints.
[Figures: whole plots in a test matrix; split plot visualized (Jones and Nachtsheim, 2009)]
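Purely as an illustration of the whole-plot idea (the factor names, levels, and plot sizes below are hypothetical, not the actual test matrix), here is a small Python sketch that builds a randomized split-plot run order: the hard-to-change setting is held constant within each whole plot, the whole plots are run in random order, and the easy-to-change factor is randomized within each plot.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
htc_levels = ["Barrel A", "Barrel B", "Barrel C"]        # hypothetical hard-to-change settings
etc_levels = [-1, 0, 1]                                   # hypothetical easy-to-change factor
whole_plots = rng.permutation(np.repeat(htc_levels, 2))   # 6 whole plots, 2 per setting, random order

rows = []
for wp_id, htc in enumerate(whole_plots, start=1):
    for etc in rng.permutation(etc_levels):               # randomize sub-plot runs within the whole plot
        rows.append({"whole_plot": wp_id, "hard_to_change": htc, "easy_to_change": etc})
run_order = pd.DataFrame(rows)
print(run_order)
```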
ITERATIVE TEST DESIGN PROCESS
Knowing that an Optimal Split Plot design would be the framework chosen, we created and compared multiple designs that fit this framework in an attempt to select the most powerful and efficient design (fewest samples) that still met the test objective. This was accomplished using the Custom Design platform in JMP 13 Pro statistical software, with D-Optimality selected as the optimality criterion. When comparing the designs, the focus was mainly on power, aliasing, and total sample size required. For power, all main effects and a handful of two-factor interactions needed to be above approximately 0.85. For aliasing, an alias structure that did not show any significant confounding was required. When assessing model term power with a noisy dispersion response, higher-than-usual powers should be targeted to mitigate the risk of missing a signal (in JMP this can be done by inflating the anticipated RMSE). The final design chosen was a D-Optimal design with 15 whole plots and 45 total configurations, as it had the smallest sample size with good power and aliasing.
[Figures: power analysis; confounding analysis]
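The sketch below illustrates why the anticipated RMSE matters. It is a rough simulated power calculation for a two-level effect on a log-transformed dispersion response, using a plain t-test with placeholder effect sizes and run counts; none of these numbers come from the presentation, and it is not JMP's power calculation, just a simulation showing that a larger assumed RMSE lowers power for the same design size.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulated_power(effect=0.20, rmse=0.25, runs_per_level=8, n_sims=5000, alpha=0.05):
    """Fraction of simulated experiments in which the assumed effect is detected."""
    hits = 0
    for _ in range(n_sims):
        lo = rng.normal(0.0, rmse, runs_per_level)        # log-dispersion at the low level
        hi = rng.normal(effect, rmse, runs_per_level)     # shifted by the assumed effect at the high level
        _, p = stats.ttest_ind(lo, hi)
        hits += p < alpha
    return hits / n_sims

for rmse in (0.20, 0.25, 0.30):    # inflating the anticipated RMSE reduces estimated power
    print(f"RMSE {rmse:.2f}: power ~ {simulated_power(rmse=rmse):.2f}")
```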
DATA ANALYSIS
Before any statistical model building, an initial exploratory data visualization took place. During this initial investigation, a few interesting relationships were observed, which helped guide and validate the model building process. An example of an interesting graphical relationship found between Propellant and the response is pictured below. After the data cleansing and visual data exploration activities, both the summarized data (sub-groups characterized by mean radius) and the raw data (X and Y coordinates) were analyzed separately, beginning with a response data transformation. This transformation was needed because there is a physical lower limit for dispersion (negative values are not feasible), and the observed values are generally close to this limit, making the data and residuals non-normal. To rectify the residual diagnostic issues and make the data approximately normal for analysis purposes, a log transform was applied throughout.
[Figures: Propellant dispersion visualization; non-transformed residuals; transformed residuals]
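A minimal sketch of that rationale, on simulated data rather than the test data: a bounded, right-skewed dispersion response produces non-normal residuals on the raw scale, and a log transform largely fixes this. The Shapiro-Wilk check used here is just one convenient diagnostic, not necessarily the one used in the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mean_radius = rng.lognormal(mean=0.0, sigma=0.5, size=60)    # simulated group-level dispersion values

for label, y in (("raw", mean_radius), ("log-transformed", np.log(mean_radius))):
    resid = y - y.mean()                                      # residuals from an intercept-only fit
    stat, p = stats.shapiro(resid)                            # Shapiro-Wilk normality check
    print(f"{label:>15}: Shapiro-Wilk p = {p:.3f}")
```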
DATA ANALYSIS CONT.
Main effects models were generated and reduced using Standard Least Squares Regression (SLSR) with the Restricted Maximum Likelihood (REML) method (REML being used because of the split-plot random effect of whole plot), eliminating the factors that appeared insignificant based on statistical and practical significance. From there, two-factor interactions were added when possible, crossing the most significant main effects and leveraging both the concept of Effect Heredity and the N^2 Diagram from the planning phase. An alternative to the SLSR analysis method for this type of data (a variation response) is the Loglinear Variance method. This method uses the raw coordinates rather than summarized metrics to create a model for the mean and standard deviation in the X and Y dimensions separately. Although usually a better representation of dispersion data (it preserves more information), the Loglinear Variance platform in JMP cannot yet handle split-plot designs, as it lacks the ability to model random effects.
[Figures: main-effects-only model for 100/300 yd combined; Loglinear Variance main-effects model for 100/300 yd, Y dimension]
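For readers outside JMP, the sketch below shows the general shape of the REML analysis on simulated data, fitting log(mean radius) with whole plot as a random effect via statsmodels. The factor names, effect sizes, and variance values are invented for illustration and are not the study's estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_wp, runs_per_wp = 15, 3
wp = np.repeat(np.arange(n_wp), runs_per_wp)
barrel = np.repeat(rng.choice([-1, 1], n_wp), runs_per_wp)      # hard-to-change, constant in each whole plot
propellant = rng.choice([-1, 0, 1], n_wp * runs_per_wp)         # easy-to-change factor
log_mr = (0.1 * barrel - 0.2 * propellant
          + np.repeat(rng.normal(0, 0.05, n_wp), runs_per_wp)   # whole-plot random effect
          + rng.normal(0, 0.15, n_wp * runs_per_wp))            # run-to-run noise

df = pd.DataFrame({"log_mr": log_mr, "barrel": barrel,
                   "propellant": propellant, "whole_plot": wp})
model = smf.mixedlm("log_mr ~ barrel + propellant", df, groups=df["whole_plot"])
fit = model.fit(reml=True)                                       # REML estimation of variance components
print(fit.summary())
```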
DATA ANALYSIS CONT.
In our case, the variance component estimate for the Whole Plots random effect appeared insignificant, allowing us to use the more powerful (with regard to dispersion analysis) Loglinear Variance personality, which does not currently allow for the modeling of random effects. Various models were created for main effects and some interactions using Loglinear Variance, and these models were compared to the REML models. Both approaches largely agreed with respect to model term significance and estimates.
[Figure: REML variance component estimates for various models]
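As a rough stand-in for the loglinear variance idea (this is a simple two-stage approximation, not JMP's joint-fitting algorithm), the sketch below models the mean of simulated Y-coordinates and then regresses the log of the squared residuals on a factor to see whether it drives the spread. The factor name and effect size are placeholders.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 600
propellant = rng.choice([-1.0, 1.0], n)               # hypothetical factor affecting spread
y_coord = rng.normal(0, np.exp(0.2 * propellant))      # standard deviation depends on the factor

X = sm.add_constant(propellant)
mean_fit = sm.OLS(y_coord, X).fit()                    # stage 1: mean model
log_sq_resid = np.log(mean_fit.resid ** 2 + 1e-12)
var_fit = sm.OLS(log_sq_resid, X).fit()                # stage 2: loglinear model for the variance
print(var_fit.params)                                  # slope ~ 2 * 0.2 = 0.4 on the log-variance scale
```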
INTERPRETATION
With the final reduced empirical models in hand, estimates could now be made for any potential configuration of the significant factors. To help management grasp the extent to which the new optimized M118LR configuration outperformed the current fielded munition, multiple Monte Carlo simulations were run using the empirical model estimation outputs. The simulation used a bivariate normal distribution (with equal vertical and horizontal standard deviations) to approximate the dispersion performance of the new and current configurations, overlaid the simulated impacts on an E-Type silhouette at 1000 yards, and computed the probability of hit for a single round of each configuration (shown right). The results were staggering, showing a drastic potential improvement in the new optimized configuration, with a probability of hit of close to 100% versus the current configuration's p-hit of 76%.
[Figure: simulated impacts of the current M118LR and optimized configurations on an E-Type silhouette at 1000 yards, with P-Hit calculations]
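A hypothetical version of that p-hit simulation is sketched below. The silhouette is approximated as a simple rectangle and the dispersion values are placeholders, so the numbers it prints are not the presentation's results; it only illustrates the mechanics of the Monte Carlo comparison.

```python
import numpy as np

rng = np.random.default_rng(11)

def p_hit(sigma_inches, half_width=9.5, half_height=19.0, n=200_000):
    """Probability of a single-round hit on a rectangular target approximation."""
    x = rng.normal(0, sigma_inches, n)     # horizontal impact error at the target
    y = rng.normal(0, sigma_inches, n)     # vertical impact error (equal sigmas)
    hits = (np.abs(x) <= half_width) & (np.abs(y) <= half_height)
    return hits.mean()

for label, sigma in (("current", 12.0), ("optimized", 7.0)):   # assumed sigmas at 1000 yards
    print(f"{label:>9}: P(hit) ~ {p_hit(sigma):.2f}")
```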
INTERPRETATION CONT.
Although this potential improvement shows promise, in the real world, once cost of implementation and recurring costs are incorporated, the decision becomes much more complex. In this environment we usually need to select a few best-value configurations to move forward for further testing. To find these best-value configurations, engineers, program managers, and manufacturing SMEs constructed estimates of one-time and recurring costs for each of the 45 configurations tested. With this cost coefficient we can create a Pareto Frontier (shown right), which considers the percent dispersion improvement over the current M118LR and the costs involved. We then consider only the configurations that exceed a worthwhile threshold of percent improvement (25% in this case) while also being the cheapest to implement. This forms the frontier along the bottom right of the plot, which helps identify the possible best-value options to move forward with. From here, the IPT decides what further testing needs to be completed for validation and user evaluation.
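The frontier construction can be expressed compactly in code. The sketch below uses random placeholder data rather than the 45 tested configurations: it keeps configurations with at least 25% dispersion improvement and then flags those not dominated by any configuration that is both cheaper (or equal) and at least as much improved.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(21)
df = pd.DataFrame({
    "config": [f"C{i:02d}" for i in range(1, 46)],
    "pct_improvement": rng.uniform(0, 50, 45),     # % dispersion improvement vs current M118LR (placeholder)
    "cost": rng.uniform(1, 10, 45),                # relative one-time plus recurring cost (placeholder)
})

candidates = df[df["pct_improvement"] >= 25].copy()   # worthwhile-improvement threshold

def on_frontier(row, data):
    """True if no other candidate is at least as cheap AND at least as improved (strictly better in one)."""
    dominated = ((data["cost"] <= row["cost"]) &
                 (data["pct_improvement"] >= row["pct_improvement"]) &
                 ((data["cost"] < row["cost"]) | (data["pct_improvement"] > row["pct_improvement"])))
    return not dominated.any()

candidates["pareto"] = candidates.apply(lambda r: on_frontier(r, candidates), axis=1)
print(candidates[candidates["pareto"]].sort_values("cost"))
```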
CONCLUSIONS
Using statistically valid and defensible test design methodologies founded in DOE principles, the IPT was able to produce a highly efficient Optimal test to screen many design factors for the dispersion improvement of the M118LR Sniper Cartridge. With the classical test design techniques of the past, testing such a complex set of factors may have been impossible or infeasible from a sample size perspective. With this modern, flexible, and lean test design, the IPT was able to generate the data required to meet the test objective in an efficient and defensible manner. Analyzing this data with modern software and cutting-edge statistical techniques such as SLSR with REML and Loglinear Variance allowed for clear and concise interpretation of the collected data. Moving forward, the work produced from this effort has provided enough evidence to select optimal configurations considering cost and performance benefits, and future work will look to validate these results with small quantities of additional testing. From this future work, the M118LR should see design modifications that improve the dispersion performance of the round, enabling our Warfighters to be more precise with their sniper rifles than ever before.
REFERENCES
1. Grubbs, F. E., "Statistical Measures of Dispersion for Riflemen and Missile Engineers," 1964.
2. Montgomery, D., Design and Analysis of Experiments, 8th ed., J. Wiley, Hoboken, New Jersey, 2009.
3. Montgomery, D., Statistical Quality Control, 7th ed., J. Wiley, Hoboken, New Jersey, 2013.
4. Jones, B., and Nachtsheim, C., "Split-Plot Designs: What, Why, and How," Journal of Quality Technology, Vol. 41, No. 4, October 2009.
5. Goos, P., and Vandebroek, M., "Optimal Split-Plot Designs," Journal of Quality Technology, Vol. 33, 2001.
QUESTIONS?