Scientific Journal of KubSAU

Polythematic online scientific journal
of Kuban State Agrarian University
ISSN 1990-4665

Name

Orlov Alexander Ivanovich

Academic degree




Academic rank

professor

Honorary rank

Organization, job position

Bauman Moscow State Technical University
   

Web site url

Email

prof-orlov@mail.ru


Articles count: 150


CHARACTERIZATION OF AVERAGE VALUES BY MEANS OF MEASUREMENT SCALES

abstract 1341710070 issue 134 pp. 853 – 883 29.12.2017 ru 274
abstract 1341710070 issue 134 pp. 853 – 883 29.12.2017 ru 274
According to measurement theory, statistical data are measured in various scales; the most widely used are the ordinal, interval and ratio scales. Statistical methods of data analysis must correspond to the scales in which the data are measured. The notion of "correspondence" is made precise with the help of the concepts of an adequate function and an admissible scale transformation. The main content of the article is a description of the average values that can be used to analyze data measured in the ordinal, interval and ratio scales, among others. The main attention is paid to Cauchy means and Kolmogorov means. Besides means, polynomials and correlation indices are also analyzed from this point of view. Detailed mathematical proofs of the characterization theorems are given for the first time in scientific periodicals. It is shown that in the ordinal scale there are exactly n average values that can be used, namely the n order statistics; the proof is presented as a chain of 9 lemmas. In the interval scale, of all the Kolmogorov means only the arithmetic mean can be used. In the ratio scale, of all the Kolmogorov means only the power means and the geometric mean are admissible. The form of adequate polynomials in the ratio scale is indicated
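The Kolmogorov means discussed in the abstract are the quasi-arithmetic means f⁻¹((f(x₁) + … + f(xₙ))/n). A minimal Python sketch of the construction (function names are illustrative, not taken from the article):

```python
import math

def kolmogorov_mean(xs, f, f_inv):
    """Kolmogorov (quasi-arithmetic) mean: f_inv of the arithmetic mean of f(x_i)."""
    return f_inv(sum(f(x) for x in xs) / len(xs))

def power_mean(xs, p):
    # f(x) = x**p yields the power mean; admissible in the ratio scale
    return kolmogorov_mean(xs, lambda x: x ** p, lambda y: y ** (1 / p))

def geometric_mean(xs):
    # f(x) = log(x) yields the geometric mean; also admissible in the ratio scale
    return kolmogorov_mean(xs, math.log, math.exp)
```

With p = 1 the power mean reduces to the arithmetic mean, the only Kolmogorov mean admissible in the interval scale according to the abstract.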

CHARACTERIZATION OF MODELS WITH DISCOUNTING

abstract 1531909022 issue 153 pp. 211 – 227 29.11.2019 ru 120
abstract 1531909022 issue 153 pp. 211 – 227 29.11.2019 ru 120
Among the widely used economic-mathematical models, dynamic programming models play an important role, and among them, models with discounting. The best-known example is the calculation of net present value (NPV) as an estimate of the efficiency of an investment project. The article clarifies which features distinguish models with discounting among all dynamic programming models. In models with discounting, the comparison of plans does not change when the starting time of their implementation changes, i.e. the results of comparing plans are stable. It is proved that if the results of comparing plans over 1 and 2 steps are stable in a dynamic programming model, then this model is a model with discounting. This theorem shows that the introduction of discount functions for estimating the effect is justified only in stable economic conditions, in which the ordering of managerial decisions does not change from year to year. In other words, if at the beginning of the period under consideration the first solution is better than the second, then at all later times, up to the end of the period, the first solution remains better than the second. Stable economic conditions are rare in the modern economy with its constant changes, including those caused by innovations. Therefore, the decision to choose (to implement) an investment project from a set of possible ones cannot be based solely on discounted project performance indicators, such as net present value and internal rate of return; such indicators can only play a supporting role. The choice of an investment project for implementation must be made on the basis of the whole range of social, technological, environmental, economic and political factors
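The net present value mentioned in the abstract discounts the cash flow of period t by (1 + r)ᵗ. A minimal sketch (names are illustrative):

```python
def npv(rate, cash_flows):
    """Net present value: cash_flows[t] is the payment at the end of period t
    (t = 0 is usually the initial investment, a negative number)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
```

Note that delaying a whole plan by k periods multiplies its NPV by 1/(1 + r)ᵏ, so the ordering of any two plans is unchanged; this is the stability property that, per the abstract, characterizes models with discounting.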

COGNITIVE FUNCTIONS AS A GENERALIZATION OF THE CLASSICAL CONCEPT OF FUNCTIONAL DEPENDENCE ON THE BASIS OF INFORMATION THEORY IN ASC-ANALYSIS AND SYSTEM FUZZY INTERVAL MATHEMATICS

abstract 0951401007 issue 95 pp. 122 – 183 30.01.2014 ru 1660
This article briefly reviews the classical concept of functional dependence in mathematics, determines the limitations of this concept for adequate modeling of reality and formulates the problem of finding a generalization of the concept of function more suitable for adequately reflecting causal relationships in a real domain. It also discusses a theoretical and practical solution of this problem, consisting in the following: (a) we suggest a universal method, independent of the subject area, for calculating the amount of information contained in the value of the argument about the value of the function, i.e. cognitive functions; (b) we offer a software tool, the Eidos intelligent system, which makes it possible to carry out these calculations in practice, i.e. to build cognitive functions from fragmented, noisy empirical data of high dimension. We also introduce the concepts of non-reduced, partially reduced and completely reduced, direct and inverse, positive and negative cognitive functions, and a method for forming reduced cognitive functions, which generalizes the well-known weighted least-squares method by weighting observations with the amount of information that the values of the argument carry about the values of the function
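The central quantity of the abstract, the amount of information in the value of the argument about the value of the function, can be illustrated by the sample mutual information of (argument, value) pairs. This is a hedged sketch of the general idea, not the Eidos system's actual computation:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Sample mutual information I(X;Y) in bits, estimated from a list of
    (x, y) pairs: how much knowing the argument x tells about the value y."""
    n = len(pairs)
    pxy = Counter(pairs)                 # joint frequencies
    px = Counter(x for x, _ in pairs)    # marginal frequencies of x
    py = Counter(y for _, y in pairs)    # marginal frequencies of y
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())
```

For a deterministic dependence on a binary argument the estimate is 1 bit; for independent pairs it is 0.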

COMPUTER-STATISTICAL METHODS: STATE AND PROSPECTS

abstract 1031409012 issue 103 pp. 163 – 195 30.11.2014 ru 914
We have analyzed the current state of the main computer-statistical methods, identified achievements and open problems, outlined the prospects for further progress and formulated the problems to be solved. We have also discussed Monte Carlo methods, pseudo-random numbers, simulation, the bootstrap and resampling, and automated system-cognitive analysis. We have considered the applications of computer statistics in controlling, and the properties of statistical packages as tools for researchers
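The bootstrap mentioned in the abstract resamples the data with replacement and recomputes a statistic on each resample, approximating its sampling distribution. A minimal sketch (the function name and defaults are illustrative):

```python
import random

def bootstrap_means(sample, n_boot=1000, seed=0):
    """Return n_boot bootstrap replicates of the sample mean:
    each replicate is the mean of a resample drawn with replacement."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(sample)
    return [sum(rng.choice(sample) for _ in range(n)) / n
            for _ in range(n_boot)]
```

The spread of the replicates gives a computer-intensive estimate of the variability of the mean, with no parametric assumptions.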

CONSEQUENCES OF DECISIONS FOR SCIENCE-TECHNOLOGY AND ECONOMIC DEVELOPMENT

abstract 1131509029 issue 113 pp. 355 – 387 30.11.2015 ru 842
The real facts presented in this article demonstrate the great importance in today's world of strategic management, of methods of analysis of innovations and investments, and of the role of decision theory in these economic disciplines. We give a retrospective analysis of the development of nuclear physics research. For the development of fundamental and applied science in the second half of the twentieth century, two events were of very great importance: the decision of US President Roosevelt to launch the nuclear program (adopted in response to a letter from Einstein) and the coincidence in time of the completion of the nuclear bomb and the end of World War II. The nuclear bombing of Hiroshima and Nagasaki determined the development of science and technology for the entire second half of the twentieth century. For the first time in world history, the leaders of the leading countries clearly saw that fundamental research can bring great practical benefit (from their point of view), namely a brand new super-powerful weapon. The consequence was broad organizational and financial support of fundamental research and of the applied research deriving from it. We analyze the influence of fundamental and applied research on the development and effective use of new technology and on technical progress. We consider the development of mathematical methods of research and of information technology, in particular the myth of "artificial intelligence"

CONTROLLING OF CONSUMER PRICES DYNAMICS AND LIVING WAGE

abstract 1261702030 issue 126 pp. 403 – 421 28.02.2017 ru 277
In accordance with Presidential Decree No. 1199 of 21 August 2012, one of the 11 integrated indicators of the activity of executive authorities is "real disposable income of the population". To calculate it, it is necessary to measure the level of consumer prices. The article presents the minimum consumer basket of physiologically essential food products, designed in 1993 by the Institute of High Statistical Technologies and Econometrics (IHSTE) on the basis of initial data from the Institute of Nutrition of the Russian Academy of Medical Sciences, and the results of measuring the cost of the IHSTE consumer basket and the inflation index over 24 years (1993 - 2017). We discuss the application of the developed tools in controlling the level of consumer prices and the living wage. Following M. Orshansky, the living wage can be estimated by multiplying the cost of the minimum food basket by a coefficient equal to the ratio of all expenses to food expenses for a poor family. This work is aimed at eliminating the monopoly of Rosstat on the calculation of the inflation index, the living wage and the real disposable income of the population. The methods of measuring and using inflation constitute an important part of the training courses in econometrics taught within the scientific-educational complex "Engineering Business and Management" of Bauman Moscow State Technical University. The Nobel laureate in economics Wassily Leontief observed that only 1% of economists analyze newly collected data, 30% use data contained in the publications of predecessors, and the rest never turn to the real world in their arguments. This work belongs to the 1% of publications (those analyzing newly collected data) about which Leontief wrote
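Orshansky's rule from the abstract reduces to one line: scale the food basket cost by the ratio of total spending to food spending of a reference poor family. A sketch with illustrative numbers (not the IHSTE basket data):

```python
def orshansky_living_wage(food_basket_cost, total_spending, food_spending):
    """Orshansky estimate of the living wage: food basket cost times the
    coefficient (total spending / food spending) of a poor reference family."""
    return food_basket_cost * (total_spending / food_spending)
```

If a poor family spends half of its budget on food, the coefficient is 2 and the living wage is twice the cost of the minimum food basket.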

CURRENT STATUS OF NONPARAMETRIC STATISTICS

abstract 1061502017 issue 106 pp. 239 – 269 28.02.2015 ru 1282
Nonparametric statistics is one of the five growth points of applied mathematical statistics. Despite the large number of publications on specific issues of nonparametric statistics, the internal structure of this research direction has remained undescribed. The purpose of this article is to delineate, on the basis of existing scientific practice, the subject of nonparametric statistics and to classify investigations on nonparametric statistical methods. Nonparametric statistics allows statistical inference, in particular estimation of characteristics of distributions and testing of statistical hypotheses, without the usually weakly justified assumption that the distribution functions of the samples belong to a particular parametric family. For example, the belief that statistical data often follow the normal distribution is widespread. Meanwhile, the analysis of observational results, in particular of measurement errors, always leads to the same conclusion: in most cases the actual distribution differs significantly from the normal one. Uncritical use of the hypothesis of normality often leads to significant errors, for example in the rejection of outlying observations (outliers), in statistical quality control, and in other cases. Therefore, it is advisable to use nonparametric methods, which impose only weak requirements on the distribution functions of the observations; usually only their continuity is assumed. On the basis of a generalization of numerous studies it can be stated that, to date, nonparametric methods can solve almost the same range of tasks as parametric methods. Statements in the literature that nonparametric methods have less power, or require larger sample sizes, than parametric methods are incorrect. Note that in nonparametric statistics, as in mathematical statistics in general, a number of unresolved problems remain
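A basic estimator of the kind the abstract describes is the empirical distribution function, which assumes no parametric family at all, only the sample itself. A minimal sketch:

```python
import bisect

def ecdf(sample):
    """Empirical distribution function F_n(t) = (number of x_i <= t) / n,
    a nonparametric estimate of the underlying distribution function."""
    xs = sorted(sample)
    n = len(xs)
    # bisect_right counts how many sorted observations are <= t
    return lambda t: bisect.bisect_right(xs, t) / n
```

By the Glivenko-Cantelli theorem this estimate converges uniformly to the true distribution function, whatever that distribution is, which is the spirit of the nonparametric approach.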

DETECTION OF DEVIATIONS IN CONTROLLING SYSTEM (FOR EXAMPLE, MONITORING THE LEVEL OF FLIGHT SAFETY)

abstract 0951401008 issue 95 pp. 184 – 203 30.01.2014 ru 1137
Control charts are proposed as a tool for detecting deviations in a controlling system. This proposal is considered for the case of monitoring flight safety. The possibility of using in airline practice a new indicator of the flight safety level and a new method of monitoring it is discussed. The ERC indicator of the ARMS group is proposed as the indicator, and the method of cumulative sums (CUSUM) as the method of monthly and weekly monitoring
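The cumulative-sum method the abstract proposes can be sketched as a one-sided CUSUM: the statistic accumulates excesses of the monitored indicator over a target plus a slack value and resets at zero, so a sustained upward shift makes it grow until it crosses an alarm threshold. Parameter names here are illustrative, not taken from the article:

```python
def cusum_upper(observations, target, slack):
    """One-sided upper CUSUM path:
    S_0 = 0,  S_t = max(0, S_{t-1} + x_t - target - slack).
    A sustained upward shift in the indicator makes S_t grow steadily."""
    s, path = 0.0, []
    for x in observations:
        s = max(0.0, s + x - target - slack)
        path.append(s)
    return path
```

In monitoring practice an alarm is raised once the path exceeds a chosen decision threshold h.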

DEVELOPMENT OF SOLIDARY INFORMATION ECONOMY

abstract 1211607007 issue 121 pp. 262 – 291 30.09.2016 ru 513
We are developing a new organizational-economic theory, the solidary information economy, based on the views of Aristotle. The name of this theory has changed over time: initially we used the term "non-formal information economy of the future", then "solidary information economy"; in connection with Biocosmology and neo-Aristotelism, the more adequate term "functionalist organic information economy" is now preferred. This article summarizes the first phase of work on the solidary information economy. We have analyzed the array of publications and discuss the main problems to whose solution the research related to this basic organizational-economic theory is devoted. The founder of economic theory is Aristotle. We discuss the positions of Aristotle on which economic theory, in particular the solidary information economy, is based. We argue that the market economy remained in the nineteenth century, and that the mainstream of modern economic science is the justification of the insolvency of the market economy and of the need to move to a planned system of economic management. We examine the impact of ICT on economic activity and develop approaches to decision-making in the solidary information economy. On the basis of modern decision theory (especially expert procedures) and information and communication technologies, people can get rid of chrematistics and come to understand the term "economy" in the sense of Aristotle

DISTANCES IN THE SPACES OF STATISTICAL DATA

abstract 1011407013 issue 101 pp. 227 – 252 30.09.2014 ru 1276
The core of applied statistics is statistics in spaces of arbitrary nature, based on the use of distances and on optimization problems. This article discusses various distances in spaces of statistical data and, in particular, their derivation from appropriate systems of axioms. The statements and proofs of the theorems are published in scientific periodicals for the first time
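The link between distances and optimization that the abstract calls the core of applied statistics can be illustrated by defining a mean in a space of arbitrary nature as a minimizer of the sum of squared distances to the sample. In this sketch the search is restricted to the sample points themselves, an illustrative simplification:

```python
def empirical_mean(points, dist):
    """Frechet-type empirical mean in a space of arbitrary nature:
    the candidate minimizing the sum of squared distances to all points."""
    return min(points, key=lambda c: sum(dist(c, x) ** 2 for x in points))
```

With dist(a, b) = |a - b| on numbers this recovers the sample point closest to the arithmetic mean; with, say, a symmetric-difference distance it defines a mean for set-valued data, where no arithmetic is available.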