October 4th
Welcome and Plenary Session (8:00-9:00AM)
The Impact of Statistical Engineering in Aerospace Research and Development at NASA
Peter Parker, NASA
While statistical methods have been widely employed in industrial product and process improvement for almost 70 years to increase productivity, quality, and profitability, their routine application at NASA is relatively recent and still emerging. Aerospace research and development projects benefit from an application-focused statistical engineering strategy that leverages statistical tools to accelerate learning, maximize knowledge, ensure strategic resource investment, and inform data-driven decisions. In this presentation, a survey of statistical engineering at NASA is provided through a collection of micro case studies in aeronautics, space exploration, and atmospheric science. Throughout, significant milestones in technical and organizational impact, educational resources, methodological extensions, and acceptance by the aerospace community are highlighted. Building on this strong foundation of demonstrated benefits, statistical engineering has established a foothold in the climb toward more ubiquitous application of innovative statistical methods at NASA.
Luncheon (12:15-1:45PM)
Reliability of Eyewitness Identification as a Forensic Tool
Karen Kafadar, ASA President, Dept. of Statistics, University of Virginia
ABSTRACT: Among the 318 wrongful convictions identified by the Innocence Project that were later overturned by DNA evidence resurrected from the crime scene, 229 (72%) involved eyewitness testimony. Such courtroom identifications from an eyewitness can be tremendously powerful evidence in a trial. Yet memory is not a perfect video recording of events, and one's recollection of the events surrounding an incident is even less reliable. In October 2014, the National Academy of Sciences issued a landmark report evaluating the scientific research on memory and eyewitness identification. The Committee, composed of researchers (psychologists, statisticians, sociologists) and representatives of the judicial system (judges, attorneys), reviewed published research, conducted via laboratory and field studies, on the factors that influence the accuracy and consistency of eyewitnesses' identifications. I will describe the research on memory and recollection, the shortcomings in the statistical methods used to evaluate laboratory studies, and the Committee's recommendations aimed at standardizing procedures and informing judicial personnel of the factors that can negatively impact the accuracy of eyewitness testimony. (The speaker was a member of the NAS Committee that issued the report.)
Youden Address (4:00-5:00PM)
The Role of DEX & EDA for Standards and the Role of Standards for DEX & EDA
Jim Filliben, Statistical Engineering Division, ITL/NIST
ABSTRACT: Churchill Eisenhart, the founder of the Statistical Engineering Division (SED) at NBS/NIST in 1947, gave the Youden address at the FTC in 1975. The first person that Eisenhart, as SED's inaugural director, hired was a bright young chemist named Jack Youden. Inspired by Youden's experimental prowess, his love of applications and problem-solving, and his prolific methodological contributions and writings, SED quickly made DEX/DOE (design of experiments) an essential component of what the SED statistical consultant offers the NIST researcher in their quest for the usual scientific R4IP: rigor, repeatability, reproducibility, robustness, insight, and predictability.
We discuss the role of DEX and EDA (exploratory data analysis) in the production of NIST standards, and in particular of NIST Standard Reference Materials (SRMs), a primary institutional deliverable that is critical for calibration in science and industry. Conversely, we also discuss the role of standards in the development and application of DEX and EDA methodology. We describe a general standardized approach/framework for statistical problem-solving, and then present specific standardized EDA methodologies that have been constructed and are routinely used post-DEX for Interlab, Comparative, and Sensitivity/Screening problems here at NIST, with obvious applications to science, engineering, and industry more broadly.
October 5th
Luncheon (11:45AM-1:15PM)
Detective X and New Insights on the Trial of the Century: Forensic Science in The State of New Jersey v. Bruno Richard Hauptmann (1935)
John Butler, NIST
ABSTRACT: On the dark and stormy night of March 1, 1932, the 20-month-old son of aviator Charles Lindbergh was kidnapped from his crib near Hopewell, New Jersey, while he slept in his upstairs bedroom. Forensic evidence involved a crude ladder left behind and handwriting on a ransom note demanding $50,000 for the child's return. The investigation, led by the New Jersey State Police (NJSP), was aided by individuals from several federal agencies, including the Federal Bureau of Investigation (FBI), the United States Department of Agriculture (USDA), and the National Bureau of Standards (NBS). The previously underappreciated role of an NBS physicist, Dr. Wilmer Souder, will be described based on memos obtained from the NJSP Archives and photographs uncovered in the National Archives. In September 1934, a German immigrant named Bruno Richard Hauptmann became a suspect in the Lindbergh baby kidnapping. Hauptmann's six-week trial in early 1935 was reported on around the world and dubbed "The Trial of the Century." The role of forensic evidence in the trial will be discussed, as well as the impact of this case on the growth of early forensic science laboratories in the United States.
Reception with SPES Special Panel Session (3:15-5:15PM)
Training the Next Generation of Statisticians: What Do They Need to Know?
Panel discussion: Will Guthrie (NIST), Richard Warr (BYU), Ruth Hummel (JMP), and Jennifer Kensler (Shell)
ABSTRACT: Traditionally, training for statistics jobs in industry and government has included topics like experimental design, statistical process control, and reliability. But in the current era of data science, are these classic topics still relevant? Should they be replaced by training in machine learning, data visualization, and programming in languages like R and Python? Or should training include both? This panel will discuss these questions from the perspectives of both the consumers (industry and government) and the producers (academia) of these students.