Conclusions
We discussed here the context of nuclear data for applications, in particular for energy production. To overcome today’s challenges (safety, sustainability, waste management), new reactor designs and fuel cycles are under study. The development of these new concepts, which will use different isotopes or faster neutrons, relies essentially on numerical simulations. These simulations use, as inputs, evaluated databases that list nuclear reaction cross sections, angular distributions, particle spectra, … The evaluations are built on experimental data and theoretical models. However, for the isotopes and neutron energy ranges considered for the next generation of reactors, the current state of the evaluations is not sufficient to ensure an accurate simulation of reactor parameters, because both the models and the experimental data still carry large uncertainties for these isotopes and neutron energies. The necessary improvement of nuclear data evaluations, in order to meet the required accuracy in reactor simulations, will therefore be achieved by a combination of new, precise experimental measurements and improved reaction models (through theoretical refinement, and/or by constraining the models’ parameters with experiments).
The DNR group at IPHC has decided to focus on inelastic neutron scattering. These reactions are important in a reactor, as they modify the neutron energy, change the neutron population, and create new isotopes (in the case of \((\text{n},~x\text{n})\) with \(x\geq 2\)). The impact of these reactions on reactor parameters (power, reactivity, …) has been demonstrated in sensitivity studies. The reaction models describing inelastic scattering can benefit from new, precise measurements, as only few experimental data are currently available to adjust the models.
With the Grapheme setup, installed at the Gelina facility at JRC-Geel, the \((\text{n},~\text{n}')\) and \((\text{n},~2\text{n})\) reactions were studied on the isotopes \(^{\text{nat}}\)Zr, \(^{\text{nat},182,183,184,186}\)W, \(^{233,235,238}\)U, and \(^{232}\)Th, with other isotopes foreseen in the near future. The experimental \((\text{n},~x\text{n}~\gamma)\) cross sections obtained with Grapheme are compared to model predictions and used to improve the reaction codes and their input parameters (level densities, \(\gamma\)-ray strength functions, …).
The tungsten isotopes are a good testing ground for measurements and confrontation with models. The \((\text{n},~\text{n}'~\gamma)\) cross sections for the even-even isotopes have already been investigated. Adding measurements for \(^{183}\)W is a natural complement to the set, as it allows the study of the unpaired nucleon, and bridges the values from \(^{182}\)W to \(^{184}\)W. The current knowledge of \((\text{n},~x\text{n})\) reactions on this isotope is limited, and our experimental values will fill a gap in the sparse \((\text{n},~\text{n}')\) data and provide a valuable data set to constrain the models and parameters of the reaction codes.
A crucial point when producing \((\text{n},~x\text{n}~\gamma)\) cross sections is precision: ideally, the relative uncertainty should be around 5 % or less. The uncertainties associated with the cross sections should be fully documented (their sources, components, shape, …) and ideally come with covariance/correlation matrices.
There are several ways to produce uncertainties and covariance matrices. One relies on the Monte Carlo (i.e. random sampling) method. In a full Monte Carlo (MC) analysis, the analysis parameters are sampled randomly according to their assigned distributions, the analysis is performed many times, and the results are collected. At the end of the iterations, a central value (mean) and a parametric uncertainty (standard deviation) are computed to produce the result. The covariance is easily obtained from the collection of iteration results.
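As a minimal sketch of this procedure (not the actual analysis code), the following Python snippet propagates two hypothetical parameters, an HPGe detection efficiency and a target mass with invented values, through a toy analysis chain; only the structure of the loop and the final statistics reflect the method described above:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-ins for the measured quantities (invented values):
# gamma-ray yields at four neutron-energy points, the neutron flux,
# and the nominal efficiency and target mass with their uncertainties.
yields = np.array([120.0, 340.0, 410.0, 280.0])
flux = np.array([1.0e5, 2.5e5, 3.0e5, 2.0e5])
eff_nominal, eff_sigma = 0.05, 0.001
mass_nominal, mass_sigma = 2.0, 0.02

def analysis(eff, mass):
    """One pass of the (simplified) analysis chain: yields -> cross sections."""
    return yields / (flux * eff * mass)

# Full Monte Carlo: sample the parameters, rerun the analysis, collect results.
n_iter = 10_000
samples = np.empty((n_iter, yields.size))
for i in range(n_iter):
    samples[i] = analysis(rng.normal(eff_nominal, eff_sigma),
                          rng.normal(mass_nominal, mass_sigma))

central = samples.mean(axis=0)             # central values (mean)
uncert = samples.std(axis=0, ddof=1)       # parametric uncertainties (std. dev.)
cov = np.cov(samples, rowvar=False)        # covariance between energy points
corr = np.corrcoef(samples, rowvar=False)  # correlation matrix
```

In the real analysis, the `analysis` function stands for the whole chain from recorded spectra to cross sections, and many more parameters are sampled.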
A new code for full MC analysis has been developed and applied to the \(^{183}\)W data set recorded with Grapheme. The code is built on a pipeline of stages and tasks, in order to automate the analysis while avoiding the reprocessing of files that did not change. The analysis is written so that it can be easily debugged and deployed on different platforms. Finally, the ultimate goal is a flexible code base that can be adapted to other data sets and experiments in the future (something that was not possible with the previous Monte Carlo code). As written now, the code can easily be adapted to any number of detectors at different angles (not just the 110\(^\circ\) and 150\(^\circ\) combination used currently).
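The stage/task mechanism can be illustrated by a minimal, hypothetical sketch (function names and cache format invented, not taken from the actual code): a stage reruns its task only when one of its input files has changed or one of its outputs is missing.

```python
import hashlib
import json
import pathlib

MANIFEST = pathlib.Path("cache_manifest.json")

def file_digest(path):
    """Hash a file's content, to detect changes between runs."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

def run_stage(name, inputs, outputs, task):
    """Run `task` only if an input file changed or an output is missing."""
    manifest = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    digests = {str(p): file_digest(p) for p in inputs}
    up_to_date = (manifest.get(name) == digests
                  and all(pathlib.Path(p).exists() for p in outputs))
    if not up_to_date:
        task()                    # (re)process this stage
        manifest[name] = digests  # record what the outputs were built from
        MANIFEST.write_text(json.dumps(manifest, indent=2))

# A calibration stage could then be declared as, e.g.:
# run_stage("calibrate", ["run_001.raw"], ["run_001.cal"], calibrate_run)
```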
The detailed workflow of the analysis has been presented, in order to explain fully how the values and their uncertainties are obtained.
The preliminary results are compatible with a previous extraction from the same data set, and show limited agreement with models, as had been seen in previous studies (even-even W isotopes, \(^{238}\)U). For the first time, the correlation matrices have been obtained directly from the full Monte Carlo analysis for all studied \(\gamma\) rays. The correlation factors show, as expected, the strong correlation between points, driven by the target mass and HPGe detection efficiency parameters. It could be enlightening to plot the correlation across all transitions together, rather than transition by transition; a sketch of how this could be assembled is given below.
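Since the MC iterations are collected per transition, a cross-transition correlation matrix could be assembled by stacking the per-transition sample matrices before computing the correlation. A minimal sketch, with hypothetical names:

```python
import numpy as np

def global_correlation(samples_by_transition):
    """Correlation matrix spanning all transitions at once.

    `samples_by_transition` maps a transition label to its MC sample
    matrix of shape (n_iterations, n_energy_points); rows must come
    from the same MC iterations so that they align across transitions.
    """
    stacked = np.hstack(list(samples_by_transition.values()))
    return np.corrcoef(stacked, rowvar=False)
```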
Although in the current form only the angle-integrated \((\text{n},~x\text{n}~\gamma)\) cross sections were produced as results of the full MC analysis, it is entirely possible to extract other quantities from it: angular cross sections, or correlation matrices between transitions (rather than strictly within a transition).
The current way of presenting the results, with side-by-side experimental measurements plotted against model calculations, is efficient in its simplicity, but somewhat dry and lacking in overview. Indeed, even in the data from a single transition, there are different components to the cross section (direct peak, feeding from the continuum, …), and a global comparison may not be the best option: what if the shape of the measured values is well reproduced by the models, but not the amplitude? What if the amplitude matches, but the shape is overall different? Following the latest work of M. Dupuis, one could look at how each transition constrains the different model parameters.
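One simple way to make the shape/amplitude distinction quantitative (a sketch under my own assumptions, not taken from the actual analysis or from M. Dupuis's work) is to fit a single scale factor between model and data by weighted least squares, and then judge the residual shape disagreement separately:

```python
import numpy as np

def amplitude_and_shape(data, sigma, model):
    """Split the data/model comparison into amplitude and shape parts.

    The weighted-least-squares scale factor measures the amplitude
    agreement; the reduced chi-square of the rescaled model then
    reflects the shape agreement alone.
    """
    w = 1.0 / sigma**2
    scale = np.sum(w * data * model) / np.sum(w * model**2)
    shape_chi2 = np.sum(w * (data - scale * model)**2) / (data.size - 1)
    return scale, shape_chi2
```

A scale factor far from 1 combined with a small shape chi-square would then point to a normalization issue rather than a deficiency in the model's energy dependence, and vice versa.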
The code can still be improved, in order to refine the results (so far preliminary) and to allow its deployment for other data sets. A list of potential improvements has been established, so that clear deadlines and delivery dates can be set. It is already foreseen that the analysis framework will be used (at least in part) for the analysis of the \(^{239}\)Pu data recorded with Grapheme (a data set that includes more HPGe detectors than the \(^{183}\)W one, with, among the extra channels, a 36-pixel detector), as well as for experimental data that will be recorded at NFS.
The \(^{183}\)W results will complete those for the even-even isotopes. We expect a quick release of the \(^{184}\)W \((\text{n},~2\text{n}~\gamma)\) cross section measurements after that. The whole experimental data set will be published rapidly and made available to the public, as well as described in a data paper. The theoretical interpretation across the \(^{182,183,184}\)W \((\text{n},~\text{n}'~\gamma)\) and \((\text{n},~2\text{n}~\gamma)\) results will follow in a scientific article.
To properly take the experimental data into account in an evaluation, the most complete description possible of the uncertainties and the way they were estimated is needed. That is why detailing the way the data were analyzed, which parameters were sampled and how, which convergence criteria were used, … is very important. The best framework to publish the details of an experimental analysis is given by Open Science. Through Open Access, the paper discussing the methods and results can be made widely available. Open Data is how the result files, but also, possibly, intermediate analysis files (like angular cross sections, flux, …), can be distributed, so that reanalysis or complementary work can be performed. And with the publication of the analysis code as Open Source, the whole analysis process can be studied, redone (if new information comes to light that warrants a reanalysis), or simply reused in other settings. In particular for uncertainties and covariances, where the process that leads to the final values matters, having the complete analysis code to play with is a significant step in helping the final user go back to the source of the final result (literally).
Therefore, not only for the \(^{183}\)W results, but also in the future (in particular with the \(^{239}\)Pu data, or experiments at NFS), the code will be our primary tool for data analysis, and the philosophy behind it (detailed description of the analysis, Open Source code, …) our driving principle.
Working with an Open Science outlook, in particular in the context of collegial research (involving researchers, students, interns, …), requires following a few procedures to be easy and successful. Some of these procedures have been presented here. They should, of course, not be followed rigidly; rather, they provide a framework, a pattern, for the research supervisor to conduct research with their team. Combined with the principles of Open Science, these time and project management tips are also really helpful to maintain research continuity, i.e. ensuring the continuation of the research work when members of the team move to other labs or fields, a student finishes their PhD, a hard drive crashes, …
This manuscript and the associated work, following the Open Science principles, are available in their entirety in Open Access: the text, its sources (text in reStructuredText, figure data and generation scripts), the compilation scripts to generate the HTML pages or the PDF, and some associated documents (an example of a DMP) and scripts.
TL;DR [1]
There is a need for improved nuclear data evaluations for the development of nuclear applications.
These improvements require new measurements and better theoretical descriptions by models.
The DNR group at IPHC focuses its work on \((\text{n},~x\text{n})\) reactions, with measurements of \((\text{n},~x\text{n}~\gamma)\) cross sections.
In this document, I present results for \(^{183}\)W, following past results on the even-even isotopes.
A full Monte Carlo analysis code was developed to analyze the data taken with the Grapheme setup at Gelina.
The analysis code is extensively documented, in order to explain how the data are produced, allowing for their meaningful use in evaluations.
Embracing the principles of Open Science, the code will be distributed in Open Source, and the results made available as Open Data. Such openness is the best way to provide the transparency needed for proper exploitation of our results.
This Open Science approach, as well as methods to help conform to it, will be our guiding principle in the future.
Footnotes