We present an overview of our software automation and scientific data management efforts and discuss frameworks for addressing the problematic issues of very large datasets and the varied formats used in seismic calibration research. The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Program has made significant progress in enhancing the process of deriving seismic calibrations and performing scientific integration with automation tools.

The technical underpinnings of our software program, Guided Interactive Statistical Decision Tools (GiSdT), are described below. In addition to addressing these quantitative technical components, the GiSdT program aims to facilitate stakeholder involvement, so that the perspectives, values, and objectives of all relevant stakeholders are addressed explicitly in the decision-making process. GiSdT is an open-source interactive framework that provides the tools needed to address the relevant components of a sustainability-based decision problem: describing the decision landscape, translating that landscape into goals and objectives, valuing the measurable attributes that describe each objective, identifying decision options, and addressing the uncertainty in the attributes (e.g., human health risk, institutional controls) through appropriate probabilistic models. To identify and develop these methods and approaches, GiSdT provides the tools decision-makers and stakeholders need to understand and characterize their current decision-making processes, and to determine what methods and approaches they need in order to proactively and quantitatively address all aspects of sustainability in their decisions.
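The last step above, handling uncertain attributes through probabilistic models, can be illustrated with a minimal Monte Carlo expected-utility sketch. This is not GiSdT code: the decision options, attribute distributions, weights, and normalization scales below are all hypothetical, chosen only to show how probabilistic models over uncertain attributes can feed a quantitative comparison of decision options.

```python
import random

# Hypothetical decision options, each with uncertain attributes modeled
# as (mean, std dev) of a normal distribution. Names and numbers are
# illustrative only, not taken from GiSdT.
OPTIONS = {
    "cap_in_place": {"health_risk": (0.30, 0.05), "cost_musd": (12.0, 2.0)},
    "excavate":     {"health_risk": (0.10, 0.02), "cost_musd": (45.0, 8.0)},
}

# Hypothetical stakeholder-elicited weights; lower risk and lower cost
# are both better, so each attribute is converted to a "higher is better"
# score via a rough normalization scale.
WEIGHTS = {"health_risk": 0.7, "cost_musd": 0.3}
SCALES = {"health_risk": 1.0, "cost_musd": 50.0}

def sample_utility(attrs, rng):
    """Draw one realization of each uncertain attribute and score it."""
    score = 0.0
    for name, (mean, sd) in attrs.items():
        value = rng.gauss(mean, sd)
        score += WEIGHTS[name] * (1.0 - value / SCALES[name])
    return score

def expected_utility(attrs, n=20000, seed=1):
    """Monte Carlo estimate of the expected utility of one option."""
    rng = random.Random(seed)
    return sum(sample_utility(attrs, rng) for _ in range(n)) / n

for option, attrs in OPTIONS.items():
    print(option, round(expected_utility(attrs), 3))
```

In a framework like the one described, the distributions would come from elicited or measured data and the weights from stakeholder valuation, but the computational core, propagating attribute uncertainty into a comparable per-option score, has this shape.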
The vast majority of remediation and waste management decisions are made without quantitative consideration of economic and socio-political factors; they are made instead based on quantitative metrics of human health risk alone. Sustainable decisions need to be based on all three 'pillars of sustainability' (economic, environmental, and social), and require understanding and characterization of the costs and values associated with each pillar. In addition, such decisions need to conform to regulatory or other legal requirements, which often constrain the decision space of interest. Although efforts are often made to include factors across all three pillars (for example, for decisions made under CERCLA), these efforts are usually qualitative and hence difficult to defend: they lack technical defensibility, transparency, and traceability. The purpose of GiSdT (Guided Interactive Statistical Decision Tools) is to provide a quantitative framework whereby all aspects of a decision problem, such as remediation and waste management decisions, can be addressed quantitatively, and hence defensibly, transparently, and traceably. Most decision-makers do not currently have access to useful or usable methods and approaches when they are presented with choices that have significant impacts across all three pillars of sustainability. The goal of GiSdT is to provide that access by identifying or developing effective, user-friendly decision methods and approaches that empower decision-makers to explicitly and routinely incorporate all aspects of sustainability into their decision-making.

Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the "front lines" of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches, and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack the extensibility and flexibility required to address the nuances of the demanding, dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open-source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.
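The analysis style Tessera supports is commonly described as divide-and-recombine: a large dataset is partitioned into subsets, an analytic method is applied to each subset independently (and so can run in parallel across a cluster), and the per-subset results are recombined into an overall answer. Tessera itself exposes this pattern through R packages backed by distributed storage; the sketch below is only a minimal, single-machine illustration of the pattern in Python, with made-up network-log records and a simple summary statistic.

```python
from statistics import mean

# Toy "network log": (host, bytes_transferred) records. In a real
# divide-and-recombine workflow the data would be terabytes spread
# across a cluster; a small in-memory list stands in for it here.
records = [
    ("hostA", 120), ("hostB", 300), ("hostA", 80),
    ("hostC", 500), ("hostB", 260), ("hostA", 100),
]

def divide(records):
    """Divide: partition the records into subsets, keyed by host."""
    subsets = {}
    for host, nbytes in records:
        subsets.setdefault(host, []).append(nbytes)
    return subsets

def apply_method(values):
    """Apply: run the per-subset analytic method (mean bytes, with count)."""
    return (mean(values), len(values))

def recombine(per_subset):
    """Recombine: merge per-subset (mean, count) pairs into an overall mean."""
    total = sum(m * n for m, n in per_subset.values())
    count = sum(n for _, n in per_subset.values())
    return total / count

subsets = divide(records)
per_subset = {host: apply_method(vals) for host, vals in subsets.items()}
overall = recombine(per_subset)
print(per_subset, overall)
```

The key property, which a distributed backend exploits, is that the apply step touches each subset independently, so only the small per-subset summaries ever need to be gathered for recombination.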