We present our observations of electromagnetic transients associated with GW170817/GRB 170817A using optical telescopes of the Chilescope observatory and the Big Scanning Antenna (BSA) of the Pushchino Radio Astronomy Observatory at 110 MHz. The Chilescope observatory detected an optical transient of ∼19 mag on the third day in the outskirts of the galaxy NGC 4993; we continued observations through its rapid decline. We place an upper limit of 1.5 × 10⁴ Jy on any radio source with a duration of 10–60 s that may be associated with GW170817/GRB 170817A. The prompt gamma-ray emission consists of two distinct components: a hard short pulse delayed by ∼2 s with respect to the LIGO signal and a softer thermal pulse with T ∼ 10 keV lasting for another ∼2 s. The appearance of a thermal component at the end of the burst is unusual for short GRBs. Neither the hard nor the soft component satisfies the Amati relation, making GRB 170817A distinctly different from other short GRBs. Based on the gamma-ray and optical observations, we develop a model for the prompt high-energy emission associated with GRB 170817A. The merger of two neutron stars creates an accretion torus of ∼10⁻² M⊙, which supplies the black hole with magnetic flux and confines the Blandford–Znajek-powered jet. We associate the hard prompt spike with the quasi-spherical breakout of the jet from the disk wind. As the jet plows through the wind with subrelativistic velocity, it creates a radiation-dominated shock that heats the wind material to tens of kiloelectronvolts, producing the soft thermal component.
Astronomical observations accumulate vast amounts of data. The Big Scanning Antenna (BSA) of the Lebedev Physical Institute (LPI), used in the study of impulsive phenomena, logs 87.5 GB of data daily (about 32 TB per year). These data are important for both short- and long-term monitoring of various classes of radio sources (including radio transients of different natures), for monitoring the Earth's ionosphere and the interplanetary and interstellar plasma, and for the search for and monitoring of new radio sources. Within this study, 83,096 individual pulse events were detected (in the study interval of July 2012 – October 2013), which may correspond to pulsars, scintillating sources, and fast radio transients. The detected pulse events are intended to be used to filter subsequent observations. The study proposes an approach based on a multilayer artificial neural network that takes raw data as input and, after processing by the hidden layers, outputs the class of the impulsive event at the output layer.
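The classifier described above can be sketched as follows. This is a minimal illustration only: the abstract does not specify layer sizes, features, or training details, so the architecture, the synthetic stand-in data, and the three class labels are all assumptions.

```python
import numpy as np

# Minimal sketch of a multilayer classifier for pulse events (hypothetical
# layer sizes; synthetic 2-D features stand in for real pulse-profile features).
rng = np.random.default_rng(0)

def one_hot(y, k):
    out = np.zeros((y.size, k))
    out[np.arange(y.size), y] = 1.0
    return out

# Synthetic stand-in data: 3 well-separated clusters, one per assumed class
# ("pulsar", "scintillating source", "fast transient").
X = np.vstack([rng.normal(c, 0.3, size=(100, 2)) for c in ([0, 0], [3, 0], [0, 3])])
y = np.repeat(np.arange(3), 100)
T = one_hot(y, 3)

# One hidden layer with tanh activation, softmax output, plain gradient descent.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 3)); b2 = np.zeros(3)
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                      # hidden layer
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)             # softmax class probabilities
    G = (P - T) / len(X)                          # cross-entropy gradient
    W2 -= 0.2 * H.T @ G; b2 -= 0.2 * G.sum(0)
    GH = (G @ W2.T) * (1 - H**2)                  # backprop through tanh
    W1 -= 0.2 * X.T @ GH; b1 -= 0.2 * GH.sum(0)

accuracy = (P.argmax(axis=1) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In the paper's setting the input layer would receive raw or lightly preprocessed BSA samples rather than two hand-made features, and the output classes would follow the event taxonomy above.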
The basic aim of this paper is to analyze and forecast changes in the market of top-level domain names following the implementation of ICANN's program for introducing new domains (new gTLDs). Registration statistics for new domain names as of the end of 2016 are presented. New criteria are proposed to describe the changes and the real usage of new domain names. We also studied users' awareness of information resources located within new domain names. Positive and negative aspects of the implementation of new gTLDs are presented.
The generality of the synergetic principles of autowave self-organization in active media makes it possible to apply a model that describes evolving physicochemical and biophysical systems, based on a modified system of FitzHugh–Nagumo equations, to the spatiotemporal behavior of the stock market and its most commonly used pattern, the propagating Elliott waves.
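For orientation, the classical (unmodified) FitzHugh–Nagumo oscillator can be integrated in a few lines; the paper's modified system is not given here, so the textbook parameter values below are an assumption chosen to place the system in its oscillatory (autowave) regime.

```python
# Classical FitzHugh-Nagumo model, forward-Euler integration.
# Parameters a, b, eps, I are standard textbook values, not the paper's.
a, b, eps, I = 0.7, 0.8, 0.08, 0.5
dt, steps = 0.01, 20000
v, w = -1.0, 1.0
trace = []
for _ in range(steps):
    dv = v - v**3 / 3 - w + I          # fast (activator) variable
    dw = eps * (v + a - b * w)         # slow (recovery) variable
    v, w = v + dt * dv, w + dt * dw
    trace.append(v)

amplitude = max(trace) - min(trace)
print(f"oscillation amplitude of v: {amplitude:.2f}")
```

With these parameters the single fixed point is an unstable spiral, so the trajectory settles onto a limit cycle of relaxation oscillations, the basic mechanism behind the propagating-wave patterns discussed above.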
Significant levels of total atmospheric emissions make the ecological modernisation of enterprises a very important and complex issue for the Republic of Armenia (RA). An agent-based model was developed to determine the best trade-offs for the ecological modernisation of enterprises. The aim is to solve a bi-objective optimisation problem whose objectives are the 'Integrated Volume of Total Emissions' and the 'Integrated Index of Industrial Production'. The results indicate that total atmospheric emissions can be reduced by more than 20% over a ten-year period. This may be done by maintaining the positive dynamics of industrial production through trade-offs chosen within the frame of the 'Pareto-optimal ecological modernisation' scenario. The scenario was obtained with the help of the suggested genetic algorithm, modified for the problem of binary control of transitions of each enterprise from its initial non-ecological state towards the target state of ecologically pure manufacturing.
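The core selection mechanism of such a bi-objective genetic algorithm with binary control can be sketched as follows. This is a toy illustration under invented numbers: the per-enterprise emission cuts and production costs below are made up, and the paper's agent-based dynamics and modified operators are not reproduced.

```python
import random

# Toy bi-objective GA: a binary genome says which enterprises are modernised;
# minimise remaining emissions, maximise production. All numbers are invented.
random.seed(1)
N = 12  # enterprises
emission_cut = [random.uniform(1, 10) for _ in range(N)]     # avoided if modernised
production_hit = [random.uniform(0.5, 5) for _ in range(N)]  # output lost in transition

def objectives(genome):
    # (remaining total emissions, total production): minimise 1st, maximise 2nd
    emissions = sum(c for c, g in zip(emission_cut, genome) if not g)
    production = 100 - sum(p for p, g in zip(production_hit, genome) if g)
    return emissions, production

def dominates(a, b):
    return a[0] <= b[0] and a[1] >= b[1] and a != b

def pareto_front(pop):
    scored = [(g, objectives(g)) for g in pop]
    return [g for g, s in scored
            if not any(dominates(s2, s) for _, s2 in scored)]

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(60)]
for _ in range(40):
    front = pareto_front(pop)
    children = []
    while len(children) < 60:
        p1, p2 = random.sample(front, 2) if len(front) > 1 else (front[0], front[0])
        cut = random.randrange(1, N)
        child = p1[:cut] + p2[cut:]       # one-point crossover
        child[random.randrange(N)] ^= 1   # bit-flip mutation
        children.append(child)
    pop = front + children                # elitism: keep the current front
front = pareto_front(pop)
print(f"non-dominated solutions found: {len(front)}")
```

Each member of the final front is a different emissions-vs-production trade-off; the 'Pareto-optimal ecological modernisation' scenario corresponds to choosing among such non-dominated solutions.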
The proposed model is intended for assessing a company's operational effectiveness, an important factor in investment decision making. Indicators of growth rate, profitability, and risk for shares listed on various stock exchanges are compared with an assessment of the company's intrinsic value and management efficiency. The information obtained is useful for investors and company managers operating on stock markets.
This work deals with investment decisions in the downstream sector; as is well known, no two refineries are exactly alike, even if they are owned by the same company (Cheremisinoff 2001). Each was designed with a combination of several technologies to meet specific requirements (market opportunities, availability, financial capability, environmental realities) (Energy 2009). One of the most commonly used methods for comparing refineries is comparison by a single technical and economic indicator, the so-called "Nelson Index" (Johnston 1996) (hereinafter referred to as the NCI), which expresses the complexity of the installed equipment relative to the primary distillation process. The NCI indicates not only the intensity of investment or the index value of the plant but also its potential for added value. Thus, the higher the NCI, the higher the cost of the oil refinery, and the higher the quality and level of its products.
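The NCI calculation itself is a capacity-weighted sum relative to atmospheric crude distillation. A minimal sketch, with illustrative (not official Nelson) complexity factors and an invented plant configuration:

```python
# Sketch of the Nelson Complexity Index: each process unit's capacity is
# weighted by a complexity factor relative to atmospheric crude distillation
# (factor 1.0), and the sum is divided by the crude distillation capacity.
# Factors and capacities below are illustrative, not official Nelson values.
units = {
    # unit: (capacity in barrels/day, complexity factor)
    "atmospheric distillation": (200_000, 1.0),
    "vacuum distillation":      (80_000, 2.0),
    "catalytic reforming":      (40_000, 5.0),
    "fluid catalytic cracking": (60_000, 6.0),
}

cdu_capacity = units["atmospheric distillation"][0]
nci = sum(cap * factor for cap, factor in units.values()) / cdu_capacity
print(f"NCI = {nci:.2f}")
```

A plant consisting only of a crude distillation unit would have NCI = 1; adding conversion units with high complexity factors raises the index, and with it the plant's replacement cost and value-added potential.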
Researchers face fundamental challenges in applying the stochastic geometry framework to the analysis of terahertz (THz) communications systems. The two major problems are the principally new propagation model, which now includes an exponential term responsible for molecular absorption, and the blocking of THz radiation by the human crowd around the receiver. These phenomena change the probability density function (pdf) of the interference from a single node such that it no longer has an analytical Laplace transform (LT), preventing characterization of the aggregated interference and signal-to-interference ratio (SIR) distributions. The expected use of highly directional antennas at both transmitter and receiver adds to this problem, increasing the complexity of modeling efforts. In this paper, we consider a Poisson deployment of interferers in ℜ² and provide accurate analytical approximations for the pdf of the interference from a randomly chosen node for the blocking and non-blocking cases. We then derive the LTs of the pdfs of the aggregated interference and SIR. Using Talbot's algorithm for the inverse transform, we provide numerical results indicating that failure to capture atmospheric absorption, blocking, or antenna directivity leads to significant modeling errors. Finally, we investigate the response of the SIR densities to a wide range of system parameters, highlighting the specific effects of THz communications systems. The model developed in this paper can be used as a building block for performance analysis of realistic THz network deployments, providing metrics such as outage and coverage probabilities.
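The numerical inversion step mentioned above can be illustrated with the fixed-Talbot variant of Talbot's method (Abate–Valkó contour). This is a generic sketch, not the paper's implementation; it is verified here on the known pair F(s) = 1/(s + 1) ↔ f(t) = e⁻ᵗ rather than on an SIR transform.

```python
import cmath, math

# Fixed-Talbot numerical inverse Laplace transform (Abate-Valko contour).
# F: the Laplace transform; t: evaluation point; M: number of contour nodes.
def talbot_inverse(F, t, M=32):
    r = 2 * M / (5 * t)
    total = 0.5 * (cmath.exp(r * t) * F(r)).real   # k = 0 term, half weight
    for k in range(1, M):
        theta = k * math.pi / M
        cot = math.cos(theta) / math.sin(theta)
        s = r * theta * (cot + 1j)                 # point on the Talbot contour
        sigma = theta + (theta * cot - 1) * cot    # contour derivative term
        total += (cmath.exp(s * t) * F(s) * (1 + 1j * sigma)).real
    return (r / M) * total

# Sanity check on a known pair: F(s) = 1/(s+1), f(t) = exp(-t).
value = talbot_inverse(lambda s: 1 / (s + 1), t=1.0)
print(f"f(1) ~ {value:.6f} (exact: {math.exp(-1):.6f})")
```

In the paper's setting, F would be the derived LT of the aggregated interference or SIR pdf, and the same routine would recover the density pointwise.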
The article describes the architecture of an integrative information and communication system, developed based on the results of space activities, for the development of the digital economy in the agro-industrial complex of the Russian Federation. The article provides a review of the development stages of information and analytical solutions in Russian agriculture. An approach is also proposed for the introduction of modern ICT in order to ensure intensive development of the agro-industrial complex in Russia within the digital economy.
We use simultaneous Cluster and THEMIS observations to study the spatial distributions of the shear BY field in the Plasma Sheet (PS) of the Earth's magnetotail at −31 RE < X < −9 RE. The best correlation between the BY field in the PS (BY_PS) and the Y-component of the Interplanetary Magnetic Field (IMF) (BY_IMF) was observed during quiet PS periods, when high-speed plasma flows were not detected. During active PS periods the correlation between BY_PS and BY_IMF was poor. The analysis of the spatial distribution of the BY field along the direction perpendicular to the Current Sheet (CS) plane showed the presence of one of the following configurations, which can be self-consistently generated in the CS: 1) the "quadrupole" distribution of the BY field, usually associated with the Hall current system in the vicinity of the X-line, and 2) the symmetrical "bell-shaped" distribution formed due to BY amplification near the neutral plane of the CS. Multipoint observations revealed the transient appearance of the "quadrupole" BY distribution during periods of X-line formation in the mid-tail. This distribution was observed for a few minutes within at least 12 RE of the estimated X-line position. In contrast, the symmetrical "bell-shaped" distribution is more localized in the radial direction and generally persists longer (up to ~10 min). Thus, the internal CS perturbations, caused either by the Hall currents related to reconnection or by the peculiarities of the local quasi-adiabatic ion dynamics, significantly affect the shear BY field that exists in the magnetotail due to partial IMF penetration.
In universities and technical colleges with relevant IT qualifications, multiple streams, courses, and specializations may use software products for training purposes within a single semester. University IT services face the challenge of creating an infrastructure of educational applications that can support the educational process. We note that the number of specializations that study information technology grows every year (for example, HSE offers minor disciplines in which students from any field can enroll). In recent years, online courses have also become popular. If the load is not planned ahead, taking future trends into account, the capacity of even the most high-tech infrastructure will be insufficient. The corresponding load on the infrastructure must be calculated while planning the disciplines, so that appropriate facilities can be reserved and an effective learning process organized.
Software developers use a variety of benchmarking tools, but these are complex and do not provide the information needed by those who plan the educational process.
This article discusses the construction of a simulation model that supports educational process planning. The simulation is carried out using the AnyLogic 7 tool. The aim of this work is to develop a simulation model for estimating the load on the information system used in the educational process. In addition to the description of the model, the article presents the results of calculations for various deployment options of the information system (a private cloud or a server at the university). The simulation results were confirmed by data obtained during practical classes at the university. The model makes it possible to plan the educational process so as to even out the load on the services. If necessary, the model also supports a decision about the location of the educational information system: on university servers or in a private cloud.
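The underlying load-estimation idea can be sketched in a few lines. This is not the AnyLogic model itself, just an illustration of the principle with an invented schedule: classes that use the same educational system overlap in time, and the planner looks for the peak of concurrent users.

```python
# Toy load estimate: sum concurrent users of a shared educational system per
# time slot across a (made-up) weekly schedule, then find the peak slot.
classes = [
    # (name, start slot, end slot, students) - all values invented
    ("Databases, stream A",  2, 4, 60),
    ("Databases, stream B",  3, 5, 55),
    ("Minor: Data Analysis", 3, 4, 120),
    ("Online course lab",    6, 8, 200),
]

SLOTS = 10
load = [0] * SLOTS
for _, start, end, students in classes:
    for slot in range(start, end):
        load[slot] += students

peak = max(load)
print(f"per-slot load: {load}")
print(f"peak concurrent users: {peak} at slot {load.index(peak)}")
```

A full simulation would add stochastic session behaviour and per-request resource costs, which is what the AnyLogic model contributes; the peak found here is what determines whether university servers suffice or a private cloud is needed.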
It is common knowledge that today's education is becoming more open and easily accessible; consequently, it is no longer limited by the boundaries of countries and regions. Moreover, online communication allows educational processes to transpire irrespective of territorial boundaries: not only is the number of students growing, but their cultural identities are also becoming more diverse. Educators nowadays face new problems caused by different world views, specific types of educational discourse, various information-processing strategies, etc. This book describes the prerequisites for development in the area of cross-cultural didactics. This approach is based on research into differences between mentalities, ways of working with educational information, and culturally specific teaching methods and techniques that determine differentiated approaches to the choice of multimedia technologies in the education system. Cross-cultural multimedia didactics may be viewed as a combination of cultural, psychological, and pedagogical aspects, of culture-specific pedagogical discourse, unique features of the ergonomic design of educational resources, cognitive and pragmatic features, and specific methods and forms of teaching, and is therefore set to become one of the most important trends in the contemporary education system.
This book will be of interest not only to professionals who work in the modern cross-cultural education environment but also to a wide range of readers interested in cross-cultural communication.
High-performance querying and ad-hoc querying are commonly viewed as mutually exclusive goals in massively parallel processing databases. Furthermore, there is a contradiction between ease of extending the data model and ease of analysis. The modern 'Data Lake' approach promises extreme ease of adding new data to a data model; however, it is prone to eventually becoming a Data Swamp: an unstructured, ungoverned, and out-of-control Data Lake where, due to a lack of process, standards, and governance, data is hard to find, hard to use, and is consumed out of context. This paper introduces a novel technique, highly normalized Big Data using Anchor modeling, that provides a very efficient way to store information and utilize resources, thereby providing ad-hoc querying with high performance for the first time in massively parallel processing databases. This technique is almost as convenient for expanding the data model as a Data Lake, while being internally protected from turning into a Data Swamp. A case study of how this approach is used for a Data Warehouse at Avito over a three-year period is also presented, with estimates for and results of real data experiments carried out in HP Vertica, an MPP RDBMS. This paper extends the theses presented at the 34th International Conference on Conceptual Modeling (ER 2015) (Golov and Rönnbäck 2015); it is complemented with numerical results on key operating areas of a highly normalized big data warehouse, collected over several (1–3) years of commercial operation. The limitations imposed by using a single MPP database cluster are also described, and a cluster fragmentation approach is proposed.
In 2016, a survey was conducted among Russian companies to discover the most common problems associated with the flexibility of business process management. A gap was identified between strict process-formalization demands and the unpredictable nature of many knowledge-intensive operations. The article suggests an approach to facilitate process management via a combined context-aware set of methods. First, key terms are selected to serve as special-cause indicators of variation in a process instance, based on risk profiles. Afterward, a cloud service is called that automatically analyzes the semantic annotation of the concrete process instance. A risk-detection service identifies potential operational risks and notifies users in case of unexpected process-execution complexities. Finally, an expert-search service automatically calls for an expert within the organization to create an expert community. This novel approach could be used in knowledge-intensive business sectors (such as research and development) or in any organization interested in increasing its agility in a changing business environment.
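The first step, key terms acting as risk indicators, can be sketched minimally as follows. The risk profile, its terms, and the sample annotation are invented examples; the paper's actual semantic-annotation analysis is a cloud service, not a keyword scan.

```python
# Minimal sketch of the key-term risk indicator step: scan a process
# instance's textual annotation for terms from a risk profile and report the
# matching risk categories. Terms and the annotation are invented examples.
risk_profile = {
    "delay": "schedule risk",
    "budget overrun": "financial risk",
    "prototype failed": "technical risk",
}

def detect_risks(annotation: str) -> list[str]:
    text = annotation.lower()
    return sorted({label for term, label in risk_profile.items() if term in text})

instance = "Sprint review: prototype failed load test, expect a two-week delay."
flags = detect_risks(instance)
print(flags)
```

In the full approach, a hit like this would trigger the downstream services: user notification by the risk-detection service and, if needed, the expert-search service assembling an expert community.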