This tutorial discusses large scientific projects and the volumes of data they generate. It provides an overview of the scientific computer networks that allow high-speed transmission of large amounts of data for these projects, and of the computing systems offered by leading manufacturers of computer equipment for processing large amounts of data, which provide the ability to store large volumes of data (including distributed storage) together with analytics and parallel data processing in real time. Particular attention is paid to the security of the transmitted scientific information.
In the present paper the author explains the results of using Smart TV as a tool for Industry 4.0, in particular for the media industry, as well as for measuring Quality of Service and for new business development. A Smart TV is a single connected device, or intelligent sensor, which increases industry performance through a number of services using the existing network infrastructure. Thanks to on-board tracking and analysis of information, Smart TVs help to improve the service for the VoD service provider and product quality for the vendor. The results of applying several problem-solving methods are reviewed in the present material.
This work is devoted to the investigation of particle acceleration during magnetospheric dipolarizations. A numerical model is presented that takes into account four scenarios of plasma acceleration that can be realized: (A) total dipolarization with characteristic time scales of 3 min; (B) a single peak of the normal magnetic component Bz occurring on a time scale of less than 1 min; (C) a sequence of rapid jumps of Bz interpreted as the passage of a chain of multiple dipolarization fronts (DFs); and (D) the action of mechanism (C) accompanied by the consequent enhancement of electric and magnetic fluctuations with a small characteristic time scale of about 1 s. In the framework of the model, we have obtained and analyzed the energy spectra of four plasma populations: electrons e, protons H+, helium He+, and oxygen O+ ions, accelerated by the above-mentioned processes (A)–(D). It is shown that O+ ions can be accelerated mainly by mechanism (A); H+ and He+ ions (and to some extent electrons) can be accelerated more effectively by mechanism (C) than by the single dipolarization (B). It is found that the high-frequency electric and magnetic fluctuations accompanying multiple DFs (D) can strongly accelerate electrons while only weakly influencing the other plasma populations. The modeling results clearly demonstrate the distinct spatial and temporal resonance character of the particle acceleration processes. The maximum particle energies are estimated depending on the scale of the magnetic acceleration region and the value of the magnetic field. The shapes of the energy spectra are discussed.
The paper deals with cyclostationarity as a natural extension of stationarity, the key property in designing widely used models of random processes. A comparative example of two processes, one wide-sense stationary and the other wide-sense cyclostationary, is given in the paper and reveals the inadequacy of the conventional stationary description based on one-dimensional autocorrelation functions. It is shown that two significantly different random processes turn out to be characterized by exactly the same one-dimensional autocorrelation function, while their two-dimensional autocorrelation functions provide a view in which the difference between processes of the two above-mentioned classes becomes much clearer. A more concise representation, obtained by expanding the two-dimensional autocorrelation function into a Fourier series in which the cyclic frequency appears as the transform parameter, is illustrated. A closed-form expression for the components of the cyclic autocorrelation function is also given for a random process that is an infinite train of rectangular pulses with randomly varying amplitudes.
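The cyclic autocorrelation discussed above can be sketched numerically. The following is a minimal illustration, not taken from the paper: the pulse period T, pulse width W, number of pulses P, and the discrete one-sided-lag estimator are all assumptions made here for demonstration. For a pulse train of period T, the component at the cyclic frequency 1/T is large, while at an incommensurate frequency it nearly vanishes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustration parameters: period T, pulse width W (samples), P pulses.
T, W, P = 20, 10, 500
x = np.zeros(T * P)
amps = rng.normal(size=P)            # randomly varying pulse amplitudes
for k in range(P):
    x[k * T : k * T + W] = amps[k]

def cyclic_autocorr(x, tau, alpha):
    """Discrete estimate of the cyclic autocorrelation R_alpha(tau),
    one-sided lag convention x[n + tau] * x[n]."""
    n = np.arange(len(x) - tau)
    return np.mean(x[n + tau] * x[n] * np.exp(-2j * np.pi * alpha * n))

# Large component at the cyclic frequency alpha = 1/T,
# near-zero component at an incommensurate alpha.
peak = abs(cyclic_autocorr(x, 0, 1.0 / T))
off = abs(cyclic_autocorr(x, 0, 1.0 / (T * np.sqrt(2))))
```

A stationary-only description (the case alpha = 0) would not distinguish this process from one with the same one-dimensional autocorrelation but no periodic structure.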
Today’s dominant design for the Internet of Things (IoT) is a Cloud-based system, where devices transfer their data to a back-end and in return receive instructions on how to act. This view is challenged when delays caused by communication with the back-end become an obstacle for IoT applications with, for example, stringent timing constraints. In contrast, Fog Computing approaches, where devices communicate and orchestrate their operations collectively and closer to the origin of data, lack adequate tools for programming secure interactions between humans and their proximate devices at the network edge. This paper fills the gap by applying the Action-Oriented Programming (AcOP) model to this task. While the AcOP model was originally proposed for Cloud-based infrastructures, here it is re-designed around the notions of coalescence and disintegration, which enable the devices to collectively and autonomously execute their operations in the Fog by serving humans in a peer-to-peer fashion. The Cloud’s role has been minimized: it is leveraged only as a development and deployment platform.
Urban greenery such as trees can effectively reduce air pollution in a natural and eco-friendly way. However, how to spatially locate and arrange greenery in an optimal way remains a challenging task. We developed an agent-based model of air pollution dynamics to support the optimal allocation and configuration of tree clusters in a city. The Pareto-optimal solutions for greenery in the city were computed using the suggested heuristic optimisation algorithm, considering the complex absorptive-diffusive interactions between agent-trees (tree clusters) and the air pollutants produced by agent-enterprises (factories) and agent-vehicles (car clusters) located in the city. We applied and tested the model with empirical data from Yerevan, Armenia, and successfully found the optimal strategy under the budget constraint: planting various types of trees around kindergartens and emission sources.
The paper proposes an algorithm and a functional model for identifying key indicators of the financial performance of sports clubs. The obtained indicators are representative and are recommended to managers for monitoring and management. Relevant automation recommendations have been developed that will help managers make decisions and minimize the cost of managerial errors.
Evolution on changing fitness landscapes (seascapes) is an important problem in evolutionary biology. We
consider the Moran model of finite population evolution with selection in a randomly changing, dynamic
environment. In the model, each individual has one of the two alleles, wild type or mutant. We calculate the
fixation probability by making a proper ansatz for the logarithm of fixation probabilities. This method has been
used previously to solve the analogous problem for the Wright-Fisher model. The fixation probability is related to
the solution of a third-order algebraic equation (for the logarithm of fixation probability). We consider the strong
interference of landscape fluctuations, sampling, and selection when the fixation process cannot be described by
the mean fitness. Such an effect appears if the mutant allele has a higher fitness in one landscape and a lower
fitness in another, compared with the wild type, and the product of effective population size and fitness is large.
We provide a generalization of the Kimura formula for the fixation probability that applies to these cases. When
the mutant allele has a fitness (dis-)advantage in both landscapes, the fixation probability is described by the mean fitness.
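As a minimal illustrative sketch (not the paper's model, which involves a randomly changing environment; here the fitness is constant and the parameter values N, r, and the trial count are assumed), a Moran process can be simulated and compared against the classical constant-fitness fixation probability ρ = (1 − 1/r)/(1 − r^(−N)):

```python
import numpy as np

def moran_fixation_sim(N, r, trials, seed=0):
    """Estimate the fixation probability of a single mutant with constant
    relative fitness r in a Moran process of population size N."""
    rng = np.random.default_rng(seed)
    fixed = 0
    for _ in range(trials):
        i = 1                                    # current number of mutants
        while 0 < i < N:
            p_birth = r * i / (r * i + (N - i))  # reproducing individual is a mutant
            birth = rng.random() < p_birth       # one fitness-weighted birth ...
            death = rng.random() < i / N         # ... and one uniform death per step
            i += int(birth) - int(death)
        fixed += (i == N)
    return fixed / trials

# Classical closed-form result for the constant-fitness Moran process.
N, r = 10, 2.0
exact = (1 - 1 / r) / (1 - r ** (-N))
est = moran_fixation_sim(N, r, trials=2000)
```

In the fluctuating-environment setting of the abstract, the fitness r would itself switch between landscapes, which is exactly where this constant-fitness formula breaks down and the paper's generalization is needed.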
An age-structured bioeconomic model, completely continuous in age and time, is developed in order to compare it with traditional discrete models. Both types have advantages and disadvantages. The continuous framework complements discrete models, as it allows for a deeper and more transparent analytical study and leads to analytical results that would be difficult to achieve within a discrete framework. To make the model realistic, a nonlinear recruitment function is introduced, and steady-state solutions and constant-effort optimal fishing are studied analytically. In addition, the framework has been used for numerical analysis. Simulations are used to investigate how optimal harvesting patterns vary with parameter values.
One of the recent major steps towards 5G cellular systems is the standardization of 5G New Radio (NR) operating in the millimeter wave (mmWave) frequency band. This radio access technology (RAT) will potentially provide extraordinary rates at the access interface, enabling a set of new bandwidth-greedy applications. However, the blockage of the line-of-sight (LoS) path between a 3GPP NR access point (AP) and the user equipment (UE) is known to drastically degrade the performance of NR communication links, thus leading to potential outage conditions. Although the problem of characterizing the LoS blockage process has been addressed in the recent literature, the proposed models are mostly limited to stationary locations of APs and UEs. In our study, we characterize the properties of the LoS blockage process under simultaneous mobility of both blockers and the UE. The model is then extended to the cases of Poisson AP deployment, multi-connectivity, and mobility of the AP, representing a ‘trilateral’ (three-sided) mobility model. We also specify a Markov-based model of the blockage process that can be efficiently used in both system-level simulations and analytical analysis of 3GPP NR systems. Using this model, we demonstrate how to derive various metrics of interest, including (i) the fraction of time in blockage, (ii) the SNR and capacity process dynamics, (iii) the probability that at time t the UE is in the blockage or non-blockage state, and (iv) the mean and distribution of the time to an outage.
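For illustration, the simplest version of a Markov blockage model is a two-state (blocked/non-blocked) chain; the closed forms below and the rate values in the test are assumptions for this sketch, not the paper's full mobility-driven model. With blocking rate λ_B and clearing rate λ_C, the stationary blockage fraction is λ_B/(λ_B + λ_C), and the transient state probability relaxes exponentially:

```python
import math

def blockage_fraction(lam_block, lam_clear):
    """Stationary fraction of time in blockage for a two-state
    continuous-time Markov chain (blocked <-> non-blocked)."""
    return lam_block / (lam_block + lam_clear)

def prob_blocked_at_t(t, lam_block, lam_clear, blocked_at_0=False):
    """P(UE is blocked at time t) for the same two-state chain:
    pi_b + (P(0) - pi_b) * exp(-(lam_block + lam_clear) * t)."""
    pi_b = blockage_fraction(lam_block, lam_clear)
    p0 = 1.0 if blocked_at_0 else 0.0
    return pi_b + (p0 - pi_b) * math.exp(-(lam_block + lam_clear) * t)
```

This directly yields metrics (i) and (iii) of the abstract for the two-state case; the paper's model additionally accounts for blocker/UE/AP mobility and multi-connectivity.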
The current challenges of many mobility solutions stem from an extremely fragmented booking system with complex service layers. A cross-company and user-friendly exchange of information and offers from different mobility providers is often not possible. Against this background, Distributed Ledger Technology (DLT) has the potential to revolutionize the existing mobility sector and enable completely new business models. Thus, we present a distributed mobility platform that is valuable for a variety of mobility services. In contrast to conventional platform approaches, the data management of our infrastructure is distributed, transparent, and cost-efficient. By prototypically implementing the concept, we demonstrate its technical feasibility and at the same time show that the introduction of our distributed mobility concept will benefit both the supply and demand sides of public transportation.
Patient flow modeling in healthcare plays an important role in understanding the operation of the system and its characteristics. Moreover, modeling techniques can significantly improve the effectiveness of medical facilities. The existing level of automation in these facilities enables the accumulation of large amounts of various data. Therefore, the collected data can be considered a source of new valuable knowledge. A novel approach to automatically identify groups of similar clinical pathways based on hospital event data is presented in the paper. More specifically, the approach summarizes the most frequent pathways by implementing hard and soft clustering algorithms in order to describe behavior patterns. The obtained clusters of clinical pathways serve as a starting point for the development of a personalized approach to modelling the heterogeneous patient flow in urban medical facilities. The results indicate the suitability of multidimensional time-series clustering and Additive Regularization of Topic Models (ARTM) for the clinical event data.
The research confirms that the application of the Hurst exponent to forecasting time series of stock prices and the stability of trends can improve forecasting results, but only over short-term horizons. The Hurst exponent can be used as an additional indicator (of data risk in forecasting) and can improve the reliability of forecast data in large-scale investment systems.
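The Hurst exponent mentioned above is commonly estimated by rescaled-range (R/S) analysis. The sketch below is a generic textbook estimator, not the paper's procedure; the series length, window sizes, and white-noise example are assumptions for illustration. For uncorrelated returns the estimate should fall near H = 0.5 (trend-persistent series give H > 0.5).

```python
import numpy as np

def hurst_rs(series, window_sizes):
    """Estimate the Hurst exponent by classical rescaled-range (R/S)
    analysis: fit log(R/S) against log(window size)."""
    mean_rs = []
    for w in window_sizes:
        vals = []
        for start in range(0, len(series) - w + 1, w):   # non-overlapping windows
            seg = series[start:start + w]
            dev = np.cumsum(seg - seg.mean())            # cumulative deviation profile
            r = dev.max() - dev.min()                    # range of the profile
            s = seg.std()                                # scale
            if s > 0:
                vals.append(r / s)
        mean_rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(mean_rs), 1)
    return slope

# Uncorrelated Gaussian returns: H should be close to 0.5
# (the small-sample R/S estimator is known to be biased slightly upward).
rng = np.random.default_rng(1)
H = hurst_rs(rng.normal(size=4096), [16, 32, 64, 128, 256])
```

Used as a risk indicator, a short-window estimate like this would flag how far recent price behavior deviates from the H = 0.5 random-walk baseline.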
This paper introduces a maximum likelihood estimator (MLE) based on an artificial neural network (ANN) for fast computation of the bearing that indicates the direction to the source of an electromagnetic wave received by a passive radar system equipped with an array antenna. The authors propose a cascade scheme for the ANN training phase, in which the network is fed with the pair-wise delays of the received stationary or cyclostationary signals, and the output of the network goes to the input of the target function being maximized, together with the same data. The designed ANN topology has a modified output layer consisting of a custom neuron that implements the argument function of a complex number rather than the linear or sigmoid-like neurons used in conventional multilayer perceptron topologies. The simulation carried out for a ring array antenna shows that a single estimate obtained via the ANN MLE takes 12 times less computational time compared to the MLE implemented via a numerical optimization technique. The degradation of accuracy, measured as the increase in mean-squared error, does not exceed 10% of the potential value for the particular signal-to-noise ratio (SNR), and that difference shows no tendency to decrease for higher SNR. The estimation error turns out to be independent of the true value over a wide range of bearings.
Information technology (IT) is an indispensable tool for any organization today, so the choice of adequate IT solutions is a critically important skill. In the literature, many methods for selecting IT solutions have been proposed, but they often use vague criteria that are very difficult to quantify and complex methods to compare alternatives. Thus, the application of these methods outside theoretical articles is restricted, since practitioners need simpler approaches. We propose a simple method for the evaluation of alternative IT solutions based on five criteria, namely the cost of ownership, the time for the change, security risks, acceptance by users, and confidence in the supplier's ability to implement the solution. In accordance with the theory of probabilistic mental models, a reference class is proposed for each criterion, and variables that can be measured quantitatively are chosen on this basis. To simplify the decision-making process, a weighted product model is used for the comparison of alternatives.
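A minimal sketch of the weighted product model (WPM) over the five criteria listed above; the weights and the two alternatives' criterion values below are invented purely for illustration and are not from the paper:

```python
def wpm_score(values, weights, is_benefit):
    """Weighted product model score: benefit criteria enter with positive
    exponents, cost criteria with negative ones. All values must be > 0."""
    score = 1.0
    for v, w, b in zip(values, weights, is_benefit):
        score *= v ** (w if b else -w)
    return score

# Criteria (order): cost of ownership, time for change, security risk,
# user acceptance, supplier confidence -- weights and values are hypothetical.
weights    = [0.30, 0.15, 0.25, 0.15, 0.15]
is_benefit = [False, False, False, True, True]
alt_a = [100.0, 6.0, 0.2, 7.0, 8.0]   # cheaper, faster, safer, better accepted
alt_b = [150.0, 9.0, 0.4, 6.0, 7.0]
better = "A" if wpm_score(alt_a, weights, is_benefit) > wpm_score(alt_b, weights, is_benefit) else "B"
```

Because WPM multiplies dimensionless ratios, the ranking of alternatives is insensitive to the units in which each criterion is measured, which is one reason it suits practitioner-oriented comparisons.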
Mathematical modeling of stock market functioning is one of the topical and, at the same time, complex tasks of modern theoretical economics. From our point of view, building such mathematical models “ab initio”, by using an analogy between the stock market and a certain physical system (in our work, a laser), is the most promising approach. This paper proposes a simple econophysical model of the stock market as an open nonequilibrium system in the form of the Lorenz–Haken equations. In this system, the variation of the ask price, the variation of the bid price, and the instantaneous difference between the numbers of agents in the active and passive states are the dynamic variables, while the intensity of the external information flow is a control parameter. This model explains the impossibility of the existence of an equilibrium state of the market and shows the presence of deterministic chaos in a stock market.
Boosting motivation is a challenging task on the way to additional productivity, especially considering the new generations Y and Z. In this paper we combine a set of methods from process management, pedagogy and psychology to develop an interactive gamification process and test it on generations Y and Z currently studying at a leading Russian university. The efficiency of the approach is demonstrated on several models, and the authors suggest ways to implement them.
The article is devoted to the analysis of scoring models used in one of the Russian commercial banks. The purpose of the article is to build a comprehensive scoring model that takes into account various groups of additional variables that increase the accuracy of the model and reduce the default percentage of borrowers. To construct such a model, it is proposed to use fuzzy control technologies as one of the methods of data mining.
The unprecedented proliferation of smart devices, together with novel communication, computing, and control technologies, has paved the way for A-IoT. This development involves new categories of capable devices, such as high-end wearables, smart vehicles, and consumer drones, aiming to enable their efficient and collaborative utilization within the smart city paradigm. While massive deployments of these objects may enrich people's lives, unauthorized access to such equipment is potentially dangerous. Hence, highly secure human authentication mechanisms have to be designed. At the same time, human beings desire comfortable interaction with the devices they own on a daily basis, thus demanding authentication procedures that are seamless and user-friendly, mindful of contemporary urban dynamics. In response to these unique challenges, this work advocates the adoption of multi-factor authentication for A-IoT, such that multiple heterogeneous methods, both well-established and emerging, are combined intelligently to reliably grant or deny access. We thus discuss the pros and cons of various solutions, as well as introduce tools to combine the authentication factors, with an emphasis on challenging smart city environments. We finally outline open questions to shape future research efforts in this emerging field.
In this article we aim to highlight the problems related to the structure and stability of the
comparatively thin current sheets that were relatively recently discovered by space missions in
the magnetospheres of the Earth and planets, as well as in the solar wind. These magnetoplasma
structures are universal in collisionless cosmic plasmas and can play a key role in the processes
of storage and release of energy in the space environment. The development of a self-consistent
theory for these sheets in the Earth’s magnetosphere, where they were first discovered, has a long
and dramatic history. The solution of the problem of the thin current sheet structure and stability
became possible in the framework of a kinetic quasi-adiabatic approach required to explain their
embedding and metastability properties. It was found that the structure and stability of these current
sheets are completely determined by the nonlinear dynamics of plasma particles. Theoretical
models have been developed to predict many properties of these structures and interpret many
experimental observations in planetary magnetospheres and the heliosphere.