Phone/Fax: (495) 771-32-38
33 Kirpichnaya Ulitsa
School Head — Svetlana Maltseva
Deputy Head of Research and Partnerships — Vasily Kornilov
Deputy Head for Prospective Student and Alumni Affairs — Vladimir Samodurov
Deputy Head for Academics — Olga Tsukanova
Deputy Head for International Relations — Michael Komarov
Mathematical modeling of stock market functioning is one of the topical and, at the same time, complex tasks of modern theoretical economics. In our view, the most promising approach is to build such mathematical models “ab initio”, using an analogy between the stock market and a certain physical system (in our work, a laser). This paper proposes a simple econophysical model of the stock market as an open nonequilibrium system in the form of the Lorenz–Haken equations. In this system, the variation of the ask price, the variation of the bid price, and the instantaneous difference between the numbers of agents in the active and passive states serve as the dynamic variables, while the intensity of the external information flow is the control parameter. This model explains the impossibility of an equilibrium state of the market and demonstrates the presence of deterministic chaos in a stock market.
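A minimal sketch of such a system, assuming the standard Lorenz form of the Lorenz–Haken equations (the identification of x with the ask-price variation, y with the bid-price variation, z with the active/passive agent-number difference, and r with the information-flow intensity is the notation assumed here, not taken from the abstract):

```latex
\dot{x} = \sigma\,(y - x), \qquad
\dot{y} = r x - y - x z, \qquad
\dot{z} = x y - b z ,
```

where \sigma and b are model constants; in the standard Lorenz system, deterministic chaos appears once the control parameter r exceeds a known threshold, which is consistent with the chaotic market regime described above.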
A simple sociophysical model is proposed to describe the transition between a chaotic and a coherent state of a microblogging social network. The model is based on the equations of evolution of the order parameter, the conjugate field, and the control parameter. The self-consistent evolution of the network is presented by equations in which the correlation function between the incoming information and the subsequent change in the number of microposts plays the role of the order parameter; the conjugate field is equal to the existing information; and the control parameter is given by the number of strategically oriented users. Analysis of the adiabatic approximation shows that the second-order phase transition, which means the network users follow a definite strategy, occurs when their initial number exceeds a critical value equal to the geometric mean of the total and critical number of users.
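With notation assumed here (n_0 for the initial number of strategically oriented users, N for the total number of users, and N_c for the critical number), the stated threshold condition reads:

```latex
n_0 > \sqrt{N \, N_c} .
```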
A fridge plays an important role in the kitchen compared to other appliances because it stores food at optimal conditions for a long period of time. Ordinary refrigerators preserve meals perfectly well, but they are not effective for food management. Providing remote control for home appliances extends the everyday usage of these devices. In addition to the remote control device, some manufacturers add modules such as internal cameras and hands-free speakers for convenient control of an appliance. All these devices are able to communicate with each other to reach common goals. The home appliance producer Liebherr, in cooperation with the technology company Microsoft, developed a solution for remote control of refrigerators with the possibility of food recognition using machine learning algorithms. This option enables automatic compilation of the list of food stored in the fridge and food ordering in an online shop without manual actions. It makes usage of the appliance more convenient and also reduces electricity consumption, because the user does not open the fridge doors as frequently once the list of food in the refrigerator is known. In this paper we describe the SmartDevice technology from Liebherr that was developed for adding smart features to the brand’s products. In particular, we review the main business processes of SmartDevice, discuss the advantages and disadvantages of this solution for end customers, and identify future research directions for creating smart fridges.
A modern enterprise has to react to permanent changes in the business environment by transforming its own behavior, operational practices and business processes. Such transformations may range from changes in business processes to changes in the information systems used to support them, changes in the underlying IT infrastructure, and even changes in the enterprise information system as a whole. The main characteristic of changes in a turbulent business environment and, consequently, in the enterprise information system is unpredictability. Therefore, an enterprise information system should support the operational efficiency of the current business model and, at the same time, provide the necessary level of agility to implement future, unpredictable changes in requirements.
This article proposes a conceptual model of an agile enterprise information system, which is defined as a working system that should eliminate the largest possible number of gaps caused by external events through incremental changes of its own components. The conceptual model, developed according to the socio-technical approach, includes the structural properties of an agile enterprise information system (actors, tasks, technology, and structure). These structural properties define its operational characteristics, i.e. measurable indicators of agility: the time, cost, scope and robustness of the change process. Different ways to build such an agile system are discussed on the basis of axiomatic design theory. We propose an approach to measuring the time, cost, scope and robustness of changes, which helps to quantitatively estimate the achieved level of agility.
We consider interior and exterior initial boundary value problems for the three-dimensional wave (d’Alembert) equation. First, we reduce a given problem to an equivalent operator equation with respect to unknown sources defined only at the boundary of the original domain. In doing so, Huygens’ principle enables us to obtain the operator equation in a form that involves only a finite and non-increasing pre-history of the solution in time. Next, we discretize the resulting boundary equation and solve it efficiently by the method of difference potentials (MDP). The overall numerical algorithm handles boundaries of general shape using regular structured grids with no deterioration of accuracy. For long simulation times it offers sub-linear complexity with respect to the grid dimension, i.e., it is asymptotically cheaper than the cost of a typical explicit scheme. In addition, our algorithm allows one to share the computational cost between multiple similar problems. On multi-processor (multi-core) platforms, it benefits from what can be considered an effective parallelization in time.
Researchers face fundamental challenges in applying the stochastic geometry framework to the analysis of terahertz (THz) communications systems. The two major problems are the principally new propagation model, which now includes an exponential term responsible for molecular absorption, and the blocking of THz radiation by the human crowd around the receiver. These phenomena change the probability density function (pdf) of the interference from a single node such that it no longer has an analytical Laplace transform (LT), preventing characterization of the aggregated interference and signal-to-interference ratio (SIR) distributions. The expected use of highly directional antennas at both transmitter and receiver adds to this problem, increasing the complexity of modeling efforts. In this paper, we consider a Poisson deployment of interferers in ℜ² and provide accurate analytical approximations for the pdf of interference from a randomly chosen node for the blocking and non-blocking cases. We then derive the LTs of the pdfs of aggregated interference and SIR. Using Talbot’s algorithm for the inverse transform, we provide numerical results indicating that failure to capture atmospheric absorption, blocking or antenna directivity leads to significant modeling errors. Finally, we investigate the response of the SIR densities to a wide range of system parameters, highlighting the specific effects of THz communications systems. The model developed in this paper can be used as a building block for performance analysis of realistic THz network deployments, providing metrics such as outage and coverage probabilities.
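Talbot-type numerical inversion of a Laplace transform can be sketched compactly. The snippet below is an illustration only, not the authors' implementation: it implements the fixed-Talbot contour of Abate and Valkó and checks it against a transform with a known inverse.

```python
import cmath
import math

def talbot_inverse(F, t, M=32):
    """Numerically invert a Laplace transform F(s) at time t > 0
    using the fixed-Talbot method (Abate & Valko, 2004)."""
    r = 2.0 * M / (5.0 * t)                  # contour scale parameter
    # Endpoint term at theta = 0, where the contour crosses the real axis at s = r.
    total = 0.5 * F(r) * math.exp(r * t)
    for k in range(1, M):
        theta = k * math.pi / M
        cot = math.cos(theta) / math.sin(theta)
        s = r * theta * (cot + 1j)           # point on the Talbot contour
        sigma = theta + (theta * cot - 1.0) * cot
        total += (cmath.exp(s * t) * F(s) * (1.0 + 1j * sigma)).real
    return r * total / M

# Sanity check: F(s) = 1/(s + 1) is the transform of f(t) = exp(-t).
approx = talbot_inverse(lambda s: 1.0 / (s + 1.0), t=1.0)
```

The same routine, applied to the LT of the aggregated interference, would recover the interference pdf pointwise; production code would typically use an arbitrary-precision variant for ill-conditioned transforms.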
This study concerns the use of crypto-currency with specific reference to the situation in Russia. A variety of such systems exist; Bitcoin, however, is perhaps the best-known example and will be used as synonymous with the concept throughout this article. Our findings not only show how the views of Russian government bodies are formed and developed, but also shed light on the specific innovative methods which legal entities use for the development of the economy. Consideration will be given to recent developments within Russia, which has been more active than many countries in seeking to clarify the status of Bitcoin and to provide for the regulation of the technology.
In this article, the requirements for a prospective register of assisted reproductive technologies are considered. The basis of such a register is a specialized data store: the electronic passport of a woman’s reproductive health. The register will serve as a basis for analysing the effectiveness of the use of assisted reproductive technologies and for supporting medical decision-making about the likelihood of pregnancy.
This paper discusses the results of experimental verification of tools for automatically determining the optimal configuration of a virtual machine (VM), implemented on the basis of previously developed models and methods (including the conditions of changing loads).
Reinforcement learning algorithms are applied to models of information processes. These models are constructed automatically in the typed π-calculus, taking the VM logs into account.
To calculate the optimal configuration of the VM, a Q-learning algorithm has been implemented. It features the reduction of terms corresponding to information processes on the basis of an abstract machine with states. The implemented method uses an applicative approach in the form of an abstract machine.
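The π-calculus term reduction and log-driven model construction described above are not reproduced here; the following is only a generic tabular Q-learning sketch on a hypothetical two-state, two-action VM-sizing task. All names, the state/action sets, and the reward function are illustrative assumptions, not the paper's model.

```python
import random
from collections import defaultdict

# Hypothetical toy setup: states are coarse load levels, actions are VM sizes.
STATES = ["low_load", "high_load"]
ACTIONS = ["small_vm", "large_vm"]

def reward(state, action):
    # Assumed reward: +1 when the VM size matches the load level, -1 otherwise.
    return 1.0 if (state == "low_load") == (action == "small_vm") else -1.0

def q_learn(episodes=5000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    Q = defaultdict(float)                       # Q[(state, action)]
    for _ in range(episodes):
        s = rng.choice(STATES)
        # Epsilon-greedy action selection.
        if rng.random() < eps:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        r = reward(s, a)
        s_next = rng.choice(STATES)              # next observed load level
        best_next = max(Q[(s_next, x)] for x in ACTIONS)
        # Standard Q-learning update.
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
    return Q

Q = q_learn()
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
```

After training, the greedy policy selects the configuration with the highest learned value for each observed load state.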
We use simultaneous Cluster and THEMIS observations to study the spatial distributions of the shear BY field in the Plasma Sheet (PS) of the Earth's magnetotail at −31 RE < X < −9 RE. The best correlation between the BY field in the PS (BY_PS) and the Y-component of the Interplanetary Magnetic Field (IMF) (BY_IMF) was observed during quiet PS periods, when high-speed plasma flows were not detected. During active PS periods the correlation between BY_PS and BY_IMF was poor. The analysis of the spatial distribution of the BY field along the direction perpendicular to the Current Sheet (CS) plane showed the presence of one of the following configurations, which can be self-consistently generated in the CS: 1) the “quadrupole” distribution of the BY field, usually associated with the Hall current system in the vicinity of the X-line, and 2) the symmetrical “bell-shaped” distribution formed due to the BY amplification near the neutral plane of the CS. Multipoint observations revealed the transient appearance of the “quadrupole” BY distribution during periods of X-line formation in the mid-tail. This distribution was observed for a few minutes within at least 12 RE of the estimated X-line position. In contrast, the symmetrical “bell-shaped” distribution is more localized in the radial direction and generally has a longer observation time (up to ~10 min). Thus, the internal CS perturbations, caused either by the Hall currents related to reconnection or by the peculiarities of the local quasi-adiabatic ion dynamics, significantly affect the shear BY field that exists in the magnetotail due to the partial IMF penetration.
The paper presents a method of classification analysis of the dynamic characteristics of the exchange value of IT companies’ shares. The influence of dynamic indicators of the market value of shares on their profitability is determined.
In the 1960s, the so-called “software crisis” triggered the advent of software engineering as a discipline. The idea was to apply the engineering methods of material production to the new domain of large-scale concurrent software systems in order to make software projects more accurate and predictable. This software engineering approach was feasible, though the methods and practices used had to differ substantially from those used in material production. The focus of the software engineering discipline was the “serial” production of substantially large-scale, complex and high-quality software systems. Researchers still debate whether the crisis in software engineering is over. The software crisis originates from a number of factors, both human-related and technology-related. To manage this crisis, the authors suggest a set of software engineering methods which systematically optimize the lifecycles for both types of these influencing factors. This lifecycle optimization strategy includes crisis-responsive methodologies, system-level architectural patterns, informing process frameworks, and a set of knowledge transfer principles. Software development usually involves customers, developers and their management; each of these parties has different preferences and expectations. These parties often differ in their vision of the resulting product; typically, the customers focus on business value while the developers are concerned with technological aspects. Such a difference in focus often results in crises. Thus, software crises often have a human factor-related root cause. To deal with this kind of crisis, software engineers should enhance their skillset with managerial skills, such as teamwork, communications, negotiations, and risk management.
Licensed assisted access (LAA) enables the coexistence of long-term evolution (LTE) and Wi-Fi in unlicensed bands, while potentially offering improved coverage and data rates. However, cooperation with the conventional random-access protocols that employ listen-before-talk (LBT) considerations makes meeting the LTE performance requirements difficult, since delay and throughput guarantees should be delivered. In this paper, we propose a novel channel sharing mechanism for the LAA system that is capable of simultaneously providing fairness of resource allocation across the competing LTE and Wi-Fi sessions as well as satisfying the quality-of-service guarantees of the LTE sessions in terms of their upper delay bound and throughput. Our proposal is based on two key mechanisms: 1) LAA connection admission control for the LTE sessions and 2) adaptive duty cycle resource division. The only external information necessary for the intended operation is the current number of active Wi-Fi sessions, inferred by monitoring the shared channel. In the proposed scheme, the LAA-enabled LTE base station fully controls the shared environment by dynamically adjusting the time allocations for both Wi-Fi and LTE technologies, while only admitting those LTE connections that would not interfere with Wi-Fi more than another Wi-Fi access point operating on the same channel would. To characterize the key performance trade-offs pertaining to the proposed operation, we develop a new analytical model. We then comprehensively investigate the performance of the developed channel sharing mechanism, confirming that it achieves a high degree of fairness between the LTE and Wi-Fi connections as well as provides guarantees in terms of upper delay bound and throughput for the admitted LTE sessions. We also demonstrate that our scheme outperforms a typical LBT-based LAA implementation.
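As a toy illustration only (the cap below is an assumption made for the sketch, not the paper's analytical model), the admission principle "interfere no more than one additional Wi-Fi access point would" can be phrased as an airtime budget:

```python
def lte_airtime_share(n_wifi_sessions: int) -> float:
    """Hypothetical fairness rule: the LAA cell may occupy no more airtime
    than one extra Wi-Fi access point on the same channel would, i.e. an
    equal share among n_wifi_sessions + 1 contenders."""
    return 1.0 / (n_wifi_sessions + 1)

def admit_lte_session(current_lte_load: float,
                      session_load: float,
                      n_wifi_sessions: int) -> bool:
    # Admit the new LTE session only if the total LTE duty cycle
    # stays within the fair airtime share.
    return current_lte_load + session_load <= lte_airtime_share(n_wifi_sessions)
```

For example, with three active Wi-Fi sessions the LAA cell would be budgeted a quarter of the airtime, and a new LTE session is admitted only while the cumulative LTE load fits under that budget.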
The importance of the problem under investigation lies in finding an effective way to manage the failures that occur when a project lacks sufficient control during implementation. Insufficient control usually leads to delays and, as a consequence, to very poor quality. The purpose of the article is to provide the project with the necessary level of control by placing control points in it. The article suggests methods for determining where and when inspections should be conducted during the construction period of the project. The materials of the article can be used by project managers for more efficient and higher-quality management, aiming at faster completion at the lowest possible cost and the highest possible quality.
Companies are increasingly paying close attention to the IP portfolio, which is a key competitive advantage, so patents and patent applications, as well as the analysis and identification of future trends, are becoming important and strategic components of a business strategy. We argue that the problems of identifying and predicting trends or entities, as well as the search for technical features, can be solved with the help of easily accessible Big Data technologies, machine learning and predictive analytics, thereby offering an effective plan for development and progress. The purpose of this study is twofold: the first goal is the identification of technological trends, and the second is the identification of the application areas that are most promising in terms of technology development and investment. The research was based on methods of clustering, processing of large text files, and search queries in patent databases. The suggested approach is considered on the basis of experimental data in the field of moving connected UAVs and passive acoustic ecology control.
We present our observations of electromagnetic transients associated with GW170817/GRB 170817A using optical telescopes of the Chilescope observatory and the Big Scanning Antenna (BSA) of Pushchino Radio Astronomy Observatory at 110 MHz. The Chilescope observatory detected an optical transient of ∼19m on the third day in the outskirts of the galaxy NGC 4993; we continued observations following its rapid decrease. We put an upper limit of 1.5 × 10⁴ Jy on any radio source with a duration of 10–60 s, which may be associated with GW170817/GRB 170817A. The prompt gamma-ray emission consists of two distinctive components—a hard short pulse delayed by ∼2 s with respect to the LIGO signal and a softer thermal pulse with T ∼ 10 keV lasting for another ∼2 s. The appearance of a thermal component at the end of the burst is unusual for short GRBs. Both the hard and the soft components do not satisfy the Amati relation, making GRB 170817A distinctively different from other short GRBs. Based on gamma-ray and optical observations, we develop a model for the prompt high-energy emission associated with GRB 170817A. The merger of two neutron stars creates an accretion torus of ∼10⁻² M⊙, which supplies the black hole with magnetic flux and confines the Blandford–Znajek-powered jet. We associate the hard prompt spike with the quasispherical breakout of the jet from the disk wind. As the jet plows through the wind with subrelativistic velocity, it creates a radiation-dominated shock that heats the wind material to tens of kiloelectron volts, producing the soft thermal component.