Statistical Global Modeling of β−-Decay Halflives Systematics Using Multilayer Feedforward Neural Networks and Support Vector Machines


N. Costiris
E. Mavrommatis
K. A. Gernoth
J. W. Clark
H. Li
Abstract

In this work, the β−-decay halflives problem is treated as a nonlinear optimization problem, which is resolved in the statistical framework of Machine Learning (ML). Continuing past similar approaches, we have constructed sophisticated Artificial Neural Networks (ANNs) and Support Vector Regression Machines (SVMs) for each class of even-odd character in Z and N, in order to model globally the systematics of nuclei that decay 100% by the β−-mode in their ground states. The resulting large-scale lifetime calculations generated by both types of machines are discussed and compared with each other, with the available experimental data, with previous results obtained with neural networks, and with estimates coming from traditional global nuclear models. Particular attention is paid to the estimates for exotic and halo nuclei, and we focus on those nuclides that are involved in r-process nucleosynthesis. It is found that statistical models based on ML can at least match, or even surpass, the predictive performance of the best conventional models of β-decay systematics, and can complement the latter.
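To illustrate the kind of statistical model the abstract describes, the sketch below trains a minimal multilayer feedforward network to regress a half-life-like quantity from proton and neutron numbers (Z, N). This is a hedged toy example: the data are synthetic placeholders (not real β−-decay data), the architecture (one tanh hidden layer, eight units) and learning rate are illustrative choices, and the actual networks of the paper are far more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "nuclide" inputs (Z, N), scaled to [0, 1]; a stand-in for the
# real training set of ground-state beta-minus emitters.
X = rng.uniform(0.0, 1.0, size=(64, 2))
# Synthetic smooth target standing in for log10(T1/2).
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1] ** 2

# One hidden layer with tanh units and a linear output unit.
W1 = rng.normal(0.0, 0.5, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8,))
b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden activations
    return h @ W2 + b2, h      # network output, hidden layer

pred, _ = forward(X)
initial_loss = np.mean((pred - y) ** 2)

# Plain batch gradient descent on the mean-squared-error loss.
lr = 0.1
for _ in range(500):
    pred, h = forward(X)
    err = pred - y
    gW2 = h.T @ err / len(X)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h ** 2)  # backprop through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

final_loss = np.mean((forward(X)[0] - y) ** 2)
```

In practice one such machine would be fitted per even-odd class in Z and N, as the abstract indicates, and an analogous regression could be performed with a support vector machine using a nonlinear kernel.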

Article Details
  • Section
  • Poster contributions
References
NuPECC Long Range Plan 2004: Perspectives for Nuclear Physics Research in Europe in the Coming Decade and Beyond (April 2004); Opportunities in Nuclear Science: The Frontiers of Nuclear Science: A Long Range Plan (DOE/NSF, 2007).
K. Takahashi, M. Yamada and T. Kondoh, At. Data Nucl. Data Tables 12 (1973) 101.
H. Nakata, T. Tachibana and M. Yamada, Nucl. Phys. A625 (1997) 521.
For the latest models of the Klapdor group see: H. Homma, M. Bender, M. Hirsch, K. Muto, H. V. Klapdor and T. Oda, Phys. Rev. C54:6 (1996) 2972; J. U. Nabi and H. V. Klapdor, At. Data Nucl. Data Tables 71 (1999) 149; ibid. 88 (2004) 237.
P. Möller, J. R. Nix and K.-L. Kratz, At. Data Nucl. Data Tables 66 (1997) 131 and references therein.
For the latest calculations see: H. Grawe, K. Langanke and G. Martínez-Pinedo, Rep. Progr. Phys. 70 (2007) 1525.
P. Möller, B. Pfeiffer and K.-L. Kratz, Phys. Rev. C67:5 (2003) 055802.
See for example: J. Engel et al., Phys. Rev. C60 (1999) 014302; I. N. Borzov, Nucl. Phys. A777 (2006) 645.
T. Niksic, T. Marketin, D. Vretenar, N. Paar and P. Ring, Phys. Rev. C71:1 (2005) 014308.
T. Marketin, D. Vretenar and P. Ring, Phys. Rev. C75 (2007) 024304.
J. W. Clark, in Scientific Applications of Neural Nets, J. W. Clark, T. Lindenau and M. L. Ristig (eds) (Springer-Verlag, Berlin, 1999) p. 1; K. A. Gernoth, ibid., p. 139.
N. J. Costiris, E. Mavrommatis, K. A. Gernoth and J. W. Clark, in Proceedings of the 16th Hellenic Symposium of the Hellenic Nuclear Physics Society, (Athens, 2006) 210 (nucl-th/0701096).
J. W. Clark and H. Li, Int. J. Mod. Phys. B20 (2006) 5015 (nucl-th/0603037).
E. Mavrommatis, A. Dakos, K. A. Gernoth and J. W. Clark, in Condensed Matter Theories, Vol. 13, edited by J. da Providencia and F. B. Malik (Nova Sciences Publishers, Commack, NY, 1998) 423.
S. Haykin, Neural Networks: A Comprehensive Foundation (Macmillan, N.Y.) 1993.
V. Vapnik, The Nature of Statistical Learning Theory (Springer, N.Y.) 1995.
N. Costiris, Diploma Thesis (University of Athens, Greece) 2006.
H. Li, Ph.D. Thesis (Washington University, USA) 2008.
G. Audi, O. Bersillon, J. Blachot and A. H. Wapstra, Nucl. Phys. A729 (2003) 3.
B. Pfeiffer, K.-L. Kratz and P. Möller (Institut für Kernchemie, Internal Report) 2003.