Phone/Fax: (495) 771-32-38
33 Kirpichnaya Ulitsa
School Head — Svetlana Maltseva
Deputy Head of Research and Partnerships — Vasily Kornilov
Deputy Head for Prospective Student and Alumni Affairs — Vladimir Samodurov
Deputy Head for Academics — Boris Poklonov
Deputy Head for International Relations — Michael Komarov
A refrigerator plays a more important role in the kitchen than other appliances because it stores food under optimal conditions for long periods. Ordinary refrigerators preserve food well, but they are not effective at food management. Providing remote control for home appliances extends their everyday usage. In addition to remote control, some manufacturers add modules such as internal cameras and hands-free speakers for convenient control of an appliance. All these devices can communicate with each other to reach common goals. The home appliance producer Liebherr, in cooperation with the technology company Microsoft, developed a solution for remote control of refrigerators with food recognition based on machine learning algorithms. This feature automatically compiles the list of food stored in the fridge and orders food from an online shop without manual actions. It not only makes the appliance more convenient to use but also reduces electricity consumption, because a user who already knows what is in the refrigerator opens its doors less frequently. In this paper we describe the SmartDevice technology that Liebherr developed to add smart features to its products. In particular, we review the main business processes of SmartDevice, discuss the advantages and disadvantages of this solution for end customers, and identify future research directions for smart fridges.
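As a rough illustration of the flow just described, here is a minimal Python sketch in which a camera frame is classified, the inventory list is updated, and missing staples are queued for ordering. All names and the classifier stub are hypothetical; the actual SmartDevice interfaces are not reproduced here.

```python
# Hypothetical sketch of the flow described above: a camera frame is
# classified, the inventory list is updated, and missing staples are queued
# for ordering. All names and the classifier stub are illustrative; the
# actual SmartDevice interfaces are not reproduced here.
from collections import Counter

STAPLES = {"milk", "eggs", "butter"}  # assumed user-configured staple items

def recognize_items(frame):
    # Stand-in for the machine learning image classifier that would run
    # on a frame from the fridge's internal camera.
    return ["milk", "cheese"]

def update_inventory(frame):
    inventory = Counter(recognize_items(frame))
    to_order = sorted(STAPLES - set(inventory))  # staples missing from fridge
    return inventory, to_order

inventory, to_order = update_inventory(frame=None)
print("In fridge:", dict(inventory), "| to order:", to_order)
```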
We consider interior and exterior initial boundary value problems for the three-dimensional wave (d'Alembert) equation. First, we reduce a given problem to an equivalent operator equation with respect to unknown sources defined only at the boundary of the original domain. In doing so, Huygens' principle enables us to obtain the operator equation in a form that involves only a finite and non-increasing pre-history of the solution in time. Next, we discretize the resulting boundary equation and solve it efficiently by the method of difference potentials (MDP). The overall numerical algorithm handles boundaries of general shape using regular structured grids with no deterioration of accuracy. For long simulation times it offers sub-linear complexity with respect to the grid dimension, i.e., it is asymptotically cheaper than the cost of a typical explicit scheme. In addition, our algorithm allows one to share the computational cost between multiple similar problems. On multi-processor (multi-core) platforms, it benefits from what can be considered an effective parallelization in time.
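Schematically, and with notation assumed here rather than quoted from the paper, the setting is as follows:

```latex
% Illustrative setting (notation assumed, not quoted from the paper):
% the 3D wave (d'Alembert) equation on a domain Omega with boundary Gamma,
\frac{1}{c^{2}} \frac{\partial^{2} u}{\partial t^{2}} - \Delta u = f
  \quad \text{in } \Omega \times (0, T],
% which the method reduces to a Calderon-type boundary equation with
% projection for the boundary trace data \xi_\Gamma,
\boldsymbol{P}_{\Gamma}\, \xi_{\Gamma} = \xi_{\Gamma},
% where, by Huygens' principle, the operator \boldsymbol{P}_\Gamma involves
% only a finite, non-increasing temporal pre-history of \xi_\Gamma.
```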
Researchers face fundamental challenges in applying the stochastic geometry framework to the analysis of terahertz (THz) communications systems. The two major problems are the fundamentally new propagation model, which now includes an exponential term responsible for molecular absorption, and the blocking of THz radiation by the human crowd around the receiver. These phenomena change the probability density function (pdf) of the interference from a single node such that it no longer has an analytical Laplace transform (LT), preventing characterization of the aggregated interference and signal-to-interference ratio (SIR) distributions. The expected use of highly directional antennas at both the transmitter and the receiver adds to this problem, increasing the complexity of modeling efforts. In this paper, we consider a Poisson deployment of interferers in ℝ² and provide accurate analytical approximations for the pdf of the interference from a randomly chosen node for the blocking and non-blocking cases. We then derive LTs of the pdfs of the aggregated interference and SIR. Using Talbot's algorithm for the inverse transform, we provide numerical results indicating that failure to capture atmospheric absorption, blocking, or antenna directivity leads to significant modeling errors. Finally, we investigate the response of the SIR densities to a wide range of system parameters, highlighting effects specific to THz communications systems. The model developed in this paper can be used as a building block for performance analysis of realistic THz network deployments, providing metrics such as outage and coverage probabilities.
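The numerical inversion step can be illustrated in isolation. The sketch below uses Talbot's method (via mpmath's invertlaplace) to recover a pdf from its LT; the exponential pdf is a stand-in chosen because its transform and inverse are known in closed form, not the paper's interference model.

```python
# Illustrative use of Talbot's algorithm for numerical inverse Laplace
# transforms, the same inversion step the paper applies to the LT of the
# aggregated interference. The exponential pdf is a stand-in, chosen
# because its transform and inverse are known in closed form.
import mpmath as mp

lam = 2.0  # hypothetical rate parameter

def pdf_lt(p):
    # Laplace transform of the exponential pdf f(t) = lam * exp(-lam * t)
    return lam / (p + lam)

for t in (0.1, 0.5, 1.0, 2.0):
    numeric = mp.invertlaplace(pdf_lt, t, method="talbot")
    exact = lam * mp.exp(-lam * t)
    print(f"t = {t}: Talbot = {numeric}, exact = {exact}")
```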
In the 1960s, the so-called "software crisis" triggered the advent of software engineering as a discipline. The idea was to apply the engineering methods of material production to the new domain of large-scale concurrent software systems in order to make software projects more accurate and predictable. This software engineering approach was feasible, though the methods and practices used had to differ substantially from those of material production. The focus of the software engineering discipline was the "serial" production of substantially large-scale, complex, high-quality software systems. Researchers still argue whether the crisis in software engineering is over. The software crisis originates from a number of factors, both human-related and technology-related. To manage this crisis, the authors suggest a set of software engineering methods that systematically optimize the lifecycle for both types of influencing factors. This lifecycle optimization strategy includes crisis-responsive methodologies, system-level architectural patterns, informing process frameworks, and a set of knowledge transfer principles. Software development usually involves customers, developers, and their management; each of these parties has different preferences and expectations. These parties often differ in their vision of the resulting product: typically, the customers focus on business value while the developers are concerned with technological aspects. Such a difference in focus often results in crises. Thus, software crises often have a human-factor-related root cause. To deal with this kind of crisis, software engineers should enhance their skill set with managerial skills such as teamwork, communication, negotiation, and risk management.
The problem under investigation is to find an effective way to manage the faults that occur when a project lacks sufficient control during implementation. Insufficient control usually leads to delays and, as a consequence, to very poor quality. The purpose of the article is to provide a project with the necessary level of control by placing control points in it. The article suggests methods for determining where and whether inspections should be conducted during the construction period of the project. The materials of the article can be used by project managers for more efficient and higher-quality management, enabling faster completion at the lowest possible cost and the highest possible quality.
We present our observations of electromagnetic transients associated with GW170817/GRB 170817A using optical telescopes of the Chilescope observatory and the Big Scanning Antenna (BSA) of the Pushchino Radio Astronomy Observatory at 110 MHz. The Chilescope observatory detected an optical transient of ∼19ᵐ on the third day in the outskirts of the galaxy NGC 4993; we continued observations following its rapid decrease. We put an upper limit of 1.5 × 10⁴ Jy on any radio source with a duration of 10–60 s that may be associated with GW170817/GRB 170817A. The prompt gamma-ray emission consists of two distinct components: a hard short pulse delayed by ∼2 s with respect to the LIGO signal and a softer thermal pulse with T ∼ 10 keV lasting for another ∼2 s. The appearance of a thermal component at the end of the burst is unusual for short GRBs. Neither the hard nor the soft component satisfies the Amati relation, making GRB 170817A distinctively different from other short GRBs. Based on the gamma-ray and optical observations, we develop a model for the prompt high-energy emission associated with GRB 170817A. The merger of two neutron stars creates an accretion torus of ∼10⁻² M⊙, which supplies the black hole with magnetic flux and confines the Blandford–Znajek-powered jet. We associate the hard prompt spike with the quasispherical breakout of the jet from the disk wind. As the jet plows through the wind with subrelativistic velocity, it creates a radiation-dominated shock that heats the wind material to tens of kiloelectron volts, producing the soft thermal component.
This book discusses smart, agile software development methods and their applications for enterprise crisis management, presenting a systematic approach that promotes agility and crisis management in software engineering. The key finding is that these crises are caused by both technology-related and human-related factors. Although mission-critical, human-related issues are often neglected. To manage the crises, the book suggests an efficient agile methodology that includes a set of models, methods, patterns, practices, and tools. Together, these make up a survival toolkit for large-scale software development in crises. Further, the book analyses lifecycles and methodologies, focusing on their impact on the project timeline and budget, and incorporates a set of industry-based patterns, practices, and case studies, combining academic concepts with the practice of software engineering.
Astronomical observations produce vast amounts of data. The BSA (Big Scanning Antenna) of LPI, used in the study of impulse phenomena, logs 87.5 GB of data daily (32 TB per year). These data are important for both short- and long-term monitoring of various classes of radio sources (including radio transients of different natures), monitoring of the Earth's ionosphere and the interplanetary and interstellar plasma, and the search for and monitoring of different classes of radio sources. Within these studies, 83,096 individual pulse events were discovered (in the study interval of July 2012 – October 2013), which may correspond to pulsars, scintillating sources, and fast radio transients. The detected pulse events are intended to be used to filter subsequent observations. The study suggests an approach based on a multilayer artificial neural network: the input layer receives the raw data, the hidden layers process it, and the output layer produces the class of the impulsive phenomenon.
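A minimal sketch of the network layout just described, assuming scikit-learn; the feature dimension, the three class labels, and the synthetic training data are placeholders for the raw BSA intensity samples.

```python
# Minimal sketch of the multilayer network described above, assuming
# scikit-learn. The feature dimension, the three class labels, and the
# synthetic training data are placeholders for the raw BSA samples.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical training set: 1000 pulse profiles of 128 raw samples each,
# labelled 0 = pulsar, 1 = scintillating source, 2 = fast radio transient.
X = rng.normal(size=(1000, 128))
y = rng.integers(0, 3, size=1000)

# One hidden layer processes the raw input; the output layer yields the class.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
clf.fit(X, y)
print("Predicted classes for five profiles:", clf.predict(X[:5]))
```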
System designs in which cyber-physical applications are securely coordinated from the cloud may simplify the development process. However, all private data are then pushed to these remote "swamps," and human users lose actual control compared to when the applications are executed directly on their devices. At the same time, computing at the network edge still lacks support for such straightforward multidevice development, which is essential for a wide range of dynamic cyber-physical services. This article proposes a novel programming model and contributes the associated secure-connectivity framework for leveraging safe coordinated device proximity as an additional degree of freedom between the remote cloud and the safety-critical network edge, especially under uncertain environment constraints. This article is part of a special issue on Software Safety and Security Risk Mitigation in Cyber-physical Systems.
Sustaining a competitive edge in today’s business world requires innovative approaches to product, service, and management systems design and performance. Advances in computing technologies have presented managers with additional challenges as well as further opportunities to enhance their business models.
Software Engineering for Enterprise System Agility: Emerging Research and Opportunities is a collection of innovative research that identifies the critical technological and management factors in ensuring the agility of business systems and investigates process improvement and optimization through software development. Featuring coverage on a broad range of topics such as business architecture, cloud computing, and agility patterns, this publication is ideally designed for business managers, business professionals, software developers, academicians, researchers, and upper-level students interested in current research on strategies for improving the flexibility and agility of businesses and their systems.
Earth's global magnetic field, generated by an internal dynamo mechanism, has been continuously changing on different time scales since its formation. Paleodata indicate that relatively long periods of evolutionary changes can be replaced by quick magnetic inversions. Based on observations, Earth's magnetic field is currently weakening and the magnetic poles are shifting, possibly indicating the beginning of the inversion process. This paper invokes Gauss coefficients to approximate the behavior of Earth's magnetic field components over the past 100 years. Using the extrapolation method, it is estimated that the magnetic dipole component will vanish by the year 3600, and at that time the geomagnetic field will be determined by a smaller quadrupole magnetic component. A numerical model is constructed which allows evaluating and comparing both galactic and solar cosmic ray fluxes in Earth's magnetosphere and on its surface during periods of dipole or quadrupole domination. The role of the atmosphere in absorbing cosmic ray particles is taken into account. An estimate of the radiation danger to humans is obtained for ground level and for the altitude of the International Space Station. It is shown that in the most unfavorable, minimum-field interval of the inversion process, the galactic cosmic ray flux increases by no more than a factor of three, implying that the radiation danger does not exceed the maximum permissible dose. Thus, magnetic inversion periods generally should not have fatal consequences for humans and nature as a whole, despite dramatically changing the structure of Earth's magnetosphere.
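For reference, the standard spherical-harmonic expansion in which the Gauss coefficients enter (this is the conventional IGRF-style representation; the paper's exact truncation order N is not stated here):

```latex
% Geomagnetic scalar potential in terms of the Gauss coefficients g_n^m,
% h_n^m (a is Earth's mean radius, P_n^m are the associated Legendre
% functions, and B = -\nabla V). The n = 1 terms form the dipole and the
% n = 2 terms the quadrupole compared above.
V(r,\theta,\varphi) = a \sum_{n=1}^{N} \left(\frac{a}{r}\right)^{n+1}
  \sum_{m=0}^{n} \left( g_n^m \cos m\varphi + h_n^m \sin m\varphi \right)
  P_n^m(\cos\theta)
```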
The current work studies the interrelations of the obtained time series by means of econometric and wavelet analysis. At the first stage, an econometric analysis was conducted and a regression was constructed that studied the influence of the number of nomads and the amount of resource on the number of plowmen. The coefficient of determination (R²) of the constructed regression was 0.81, and the Durbin-Watson statistic was 0.94, which indicates positive first-order autocorrelation of the errors. The next stage is an analysis based on wavelet transforms, which removes high-frequency "noise" and interference from the considered time series. Within the framework of this paper, the Haar wavelet and the Daubechies 2 wavelet were considered (other wavelets give similar results). After the time series had been cleaned by wavelet analysis, regression analysis was applied again. The coefficient of determination of the new regressions, depending on which wavelet was applied and which frequencies were removed, took values in the range from 0.86 to 0.93. However, the Durbin-Watson statistic decreased to values in the range from 0.01 to 0.46, which still indicates positive first-order autocorrelation of the errors. We conclude that, in this situation, wavelet analysis significantly increases the explanatory power of the regression; on the other hand, the problem of error autocorrelation cannot be resolved this way, and in some sense it only gets worse.
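A compact sketch of the two-stage procedure, assuming PyWavelets and statsmodels; the two series below are synthetic stand-ins for the nomad/plowman data, and the drop_levels parameter mirrors the "which frequencies were removed" dimension of the experiments.

```python
# Sketch of the two-stage procedure: wavelet-denoise both series, re-run
# the regression, and check the Durbin-Watson statistic. The series are
# synthetic stand-ins for the nomad/plowman data.
import numpy as np
import pywt
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
t = np.arange(256)
nomads = np.sin(t / 20) + 0.3 * rng.normal(size=t.size)
plowmen = 0.8 * nomads + 0.2 * rng.normal(size=t.size)

def denoise(x, wavelet="db2", drop_levels=2):
    # Zero the finest `drop_levels` detail bands (high-frequency noise),
    # then reconstruct the series.
    coeffs = pywt.wavedec(x, wavelet)
    for i in range(1, drop_levels + 1):
        coeffs[-i] = np.zeros_like(coeffs[-i])
    return pywt.waverec(coeffs, wavelet)[: x.size]

X = sm.add_constant(denoise(nomads))
model = sm.OLS(denoise(plowmen), X).fit()
print("R^2 =", model.rsquared, "DW =", durbin_watson(model.resid))
```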
The article deals with the application of information technology to the calculation of fuel consumption rates for the operation of vehicles in supply chains. It is shown that, as a result, the labor intensity of operations performed by dispatching personnel is reduced and the validity of the norms increases, which ultimately leads to a decrease in logistics costs.
The basic aim of this paper is to analyze and forecast changes in the market of top-level domain names following the results of ICANN's program for introducing new domains (new gTLDs). Registration statistics for new domain names as of the end of 2016 are presented. New criteria are proposed to describe the changes and the real usage of new domain names. We also studied users' knowledge of information resources located within the new domain names. Positive and negative aspects of the implementation of new gTLDs are presented.