UDC 519.7
DOI: 10.15507/2658-4123.029.201902.205-217
The Neural Network Analysis of Normality of Small Samples of Biometric Data through Using the Chi-Square Test and Anderson–Darling Criteria
Vladimir I. Volchikhin
President, Penza State University (40 Krasnaya St., Penza 440026, Russia), D.Sc. (Engineering), Professor, ResearcherID: O-9718-2015, ORCID: https://orcid.org/0000-0002-9986-521X
Aleksandr I. Ivanov
Head, Laboratory of Biometric and Neural Network Technologies, Penza Research Electrotechnical Institute (9 Sovetskaya St., Penza 440000, Russia), D.Sc. (Engineering), Associate Professor, ResearcherID: R-4514-2019, ORCID: https://orcid.org/0000-0003-3475-2182
Alexander V. Bezyaev
Doctoral Candidate, Chair of Information Security of Systems and Technologies, Penza State University (40 Krasnaya St., Penza 440026, Russia), Ph.D. (Engineering), ResearcherID: Q-9589-2019, ORCID: https://orcid.org/0000-0003-0703-3270
Evgeniy N. Kupriyanov
Graduate Student, Chair of Information Security of Systems and Technologies, Penza State University (40 Krasnaya St., Penza 440026, Russia), Publons: https://publons.com/researcher/2956834/evgenyi-kupriyanov, ORCID: https://orcid.org/0000-0003-0806-1476
Introduction. The aim of the work is to reduce the test sample size required when testing the hypothesis of normality.
Materials and Methods. A neural network generalization of three well-known statistical criteria is used: the chi-square criterion, the Anderson–Darling criterion in ordinary form, and the Anderson–Darling criterion in logarithmic form.
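As an illustration of the base statistics involved, the following minimal sketch (not the authors' implementation) computes the chi-square goodness-of-fit statistic and the ordinary Anderson–Darling statistic for a small sample against a normal law fitted to that sample. The logarithmic Anderson–Darling variant used by the authors is not reproduced, since its exact form is not given in this abstract; the bin count and function names are illustrative assumptions.

# A minimal sketch, assuming Python with NumPy/SciPy; not the authors' code.
import numpy as np
from scipy.stats import norm

def chi_square_statistic(sample, n_bins=5):
    """Chi-square goodness-of-fit statistic against a normal law fitted to the sample."""
    sample = np.asarray(sample, dtype=float)
    mu, sigma = np.mean(sample), np.std(sample, ddof=1)
    # Mapping through the fitted CDF turns equiprobable bins of the normal law
    # into equal-width bins on [0, 1].
    u = norm.cdf(sample, loc=mu, scale=sigma)
    observed, _ = np.histogram(u, bins=n_bins, range=(0.0, 1.0))
    expected = len(sample) / n_bins
    return float(np.sum((observed - expected) ** 2 / expected))

def anderson_darling_statistic(sample):
    """Ordinary Anderson-Darling statistic A^2 against a normal law fitted to the sample."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    f = norm.cdf(x, loc=mu, scale=sigma)
    i = np.arange(1, n + 1)
    return float(-n - np.mean((2 * i - 1) * (np.log(f) + np.log(1.0 - f[::-1]))))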
Results. Combining the chi-square criterion and the Anderson–Darling criterion in a neural network reduces the sample size requirements by about 40 %. Adding a third neuron that reproduces the logarithmic version of the Anderson–Darling test gives a further small reduction in the error probability of about 2 %. The article considers single-layer and multilayer neural networks that generalize many currently known statistical criteria.
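The abstract does not specify how the criterion outputs are fused, so the sketch below assumes the simplest scheme consistent with the description: one threshold neuron per criterion, with a joint decision formed from their binary outputs. The thresholds, bin count, and conjunction rule are illustrative placeholders, not the calibrated values or the fusion rule of the paper.

# A minimal sketch of criterion fusion, assuming one threshold neuron per test;
# thresholds are placeholders, not calibrated values from the paper.
import numpy as np
from scipy import stats

def criterion_neuron(value, threshold):
    """Threshold 'neuron': returns 1 when the criterion rejects normality."""
    return int(value > threshold)

def fused_normality_decision(sample, n_bins=5):
    """Joint decision of two criterion neurons; 1 = reject the normality hypothesis."""
    sample = np.asarray(sample, dtype=float)
    # Ordinary Anderson-Darling statistic for the normal law (SciPy built-in).
    ad_stat = stats.anderson(sample, dist="norm").statistic
    # Chi-square statistic over equiprobable bins of the fitted normal law.
    mu, sigma = np.mean(sample), np.std(sample, ddof=1)
    u = stats.norm.cdf(sample, loc=mu, scale=sigma)
    observed, _ = np.histogram(u, bins=n_bins, range=(0.0, 1.0))
    expected = len(sample) / n_bins
    chi2_stat = np.sum((observed - expected) ** 2 / expected)
    votes = [
        criterion_neuron(chi2_stat, threshold=9.5),  # placeholder threshold
        criterion_neuron(ad_stat, threshold=0.75),   # placeholder threshold
    ]
    return int(sum(votes) == len(votes))  # reject only when both neurons agree

The conjunction of two binary votes is only one of many possible fusion rules; the gains reported in the paper refer to its own neural network fusion, not to this simplified scheme.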
Discussion and Conclusion. It is assumed that an artificial neuron can be assigned to each known statistical criterion. The attitude toward the synthesis of new statistical criteria that prevailed in the 20th century needs to change: there is no longer a need to strive for criteria of high statistical power. It is much more advantageous to ensure that newly synthesized statistical criteria correlate weakly with the many criteria already created.
Keywords: chi-square test, Anderson–Darling criterion, artificial neural network, statistical criterion, neural network reproduction of statistical criteria, neural network analysis, small sample
For citation: Volchikhin V.I., Ivanov A.I., Bezyaev A.V., Kupriyanov E.N. The Neural Network Analysis of Normality of Small Samples of Biometric Data through Using the Chi-Square Test and Anderson–Darling Criteria. Inzhenernyye tekhnologii i sistemy = Engineering Technologies and Systems. 2019; 29(2):205-217. DOI: https://doi.org/10.15507/2658-4123.029.201902.205-217
Contribution of the authors: V. I. Volchikhin – the development of the concept of a neural network association of statistical criteria; A. I. Ivanov – the formalization of the neural network description of the statistical criteria under consideration; A. V. Bezyaev – the adjustment of the output codes of a neural network that generalizes statistical criteria; E. N. Kupriyanov – the software implementation of calculations and formation of tables with data.
All authors have read and approved the final version of the paper.
Received 15.02.2019; revised 25.04.2019; published online 28.06.2019
This work is licensed under a Creative Commons Attribution 4.0 License.