61 results for Reliability in refrigeration systems
Abstract:
Computational performance increasingly depends on parallelism, and many systems rely on heterogeneous resources such as GPUs and FPGAs to accelerate computationally intensive applications. However, implementations for such heterogeneous systems are often hand-crafted and optimised for a single computation scenario, and it can be challenging to maintain high performance when application parameters change. In this paper, we demonstrate that machine learning can help to dynamically choose parameters for task scheduling and load-balancing based on changing characteristics of the incoming workload. We use a financial option pricing application as a case study. We propose a simulation of processing financial tasks on a heterogeneous system with GPUs and FPGAs, and show how dynamic, on-line optimisations could improve such a system. We compare on-line and batch processing algorithms, and we also consider cases with no dynamic optimisations.
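As a rough illustration of the on-line tuning idea in this abstract (not the authors' actual system), the sketch below assumes a hypothetical makespan model for a two-device GPU/FPGA split and adjusts the split fraction with a finite-difference gradient step as workload feedback arrives:

```python
def simulate(split, gpu_rate=4.0, fpga_rate=2.0, n_tasks=1000):
    """Makespan when a fraction `split` of tasks goes to the GPU (assumed rates)."""
    gpu_time = split * n_tasks / gpu_rate
    fpga_time = (1.0 - split) * n_tasks / fpga_rate
    return max(gpu_time, fpga_time)

def online_tune(steps=50, lr=5e-5):
    """Adapt the GPU/FPGA split on-line by descending the observed makespan."""
    split = 0.5
    for _ in range(steps):
        # Finite-difference estimate of d(makespan)/d(split).
        grad = (simulate(split + 1e-3) - simulate(split - 1e-3)) / 2e-3
        split = min(0.99, max(0.01, split - lr * grad))
    return split
```

With the assumed rates the optimal split is 2/3 to the GPU; the tuner converges to a small neighbourhood of that value without knowing the rates in advance.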
Abstract:
We introduce self-interested evolutionary market agents, which act on behalf of service providers in a large decentralised system, to adaptively price their resources over time. Our agents competitively co-evolve in the live market, driving it towards the Bertrand equilibrium, the non-cooperative Nash equilibrium, at which all sellers charge their reserve price and share the market equally. We demonstrate that this outcome results in even load-balancing between the service providers. Our contribution in this paper is twofold: the use of on-line competitive co-evolution of self-interested service providers to drive a decentralised market towards equilibrium, and a demonstration that load-balancing behaviour emerges under the assumptions we describe. Unlike previous studies on this topic, all our agents are entirely self-interested; no cooperation is assumed. This makes our problem a non-trivial and more realistic one.
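A minimal sketch of the undercutting dynamics behind a Bertrand equilibrium (a toy model, not the paper's evolutionary agents): each seller repeatedly undercuts the cheapest competitor until all prices hit the reserve price.

```python
def bertrand(prices, reserve, rounds=100, step=0.5):
    """Iterated price undercutting; converges to all sellers at `reserve`."""
    prices = list(prices)
    for _ in range(rounds):
        for i in range(len(prices)):
            lowest_other = min(p for j, p in enumerate(prices) if j != i)
            # Undercut the cheapest competitor, but never go below reserve.
            prices[i] = max(reserve, min(prices[i], lowest_other - step))
    return prices
```

At the fixed point every seller charges the reserve price, so no seller is cheaper than another and load spreads evenly, mirroring the equilibrium the abstract describes.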
Abstract:
This work introduces a Gaussian variational mean-field approximation for inference in dynamical systems which can be modeled by ordinary stochastic differential equations. This new approach allows one to express the variational free energy as a functional of the marginal moments of the approximating Gaussian process. A restriction of the moment equations to piecewise polynomial functions, over time, dramatically reduces the complexity of approximate inference for stochastic differential equation models and makes it comparable to that of discrete-time hidden Markov models. The algorithm is demonstrated on state and parameter estimation for nonlinear problems with up to 1000-dimensional state vectors, and the results are compared empirically with various well-known inference methodologies.
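To make the moment-equation idea concrete, here is a sketch (not the paper's algorithm) of the marginal mean and variance ODEs for the Ornstein-Uhlenbeck process dx = -θx dt + σ dW, integrated with a simple Euler step:

```python
def ou_moments(theta, sigma, m0, s0, t_end, dt=1e-3):
    """Integrate dm/dt = -theta*m and dS/dt = -2*theta*S + sigma^2 (Euler)."""
    m, s = m0, s0
    for _ in range(int(t_end / dt)):
        m += dt * (-theta * m)
        s += dt * (-2.0 * theta * s + sigma ** 2)
    return m, s
```

For θ = σ = 1 the mean decays to zero and the variance settles at the stationary value σ²/(2θ) = 0.5, which is the kind of marginal-moment trajectory a Gaussian process approximation tracks.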
Abstract:
A novel framework for modelling biomolecular systems at multiple scales in space and time simultaneously is described. The atomistic molecular dynamics representation is smoothly connected with a statistical continuum hydrodynamics description. The system behaves correctly in the limits of pure molecular dynamics and pure hydrodynamics, and in the intermediate regimes, when the atoms move partly as atomistic particles and at the same time follow the hydrodynamic flows. The corresponding contributions are controlled by a parameter, which is defined as an arbitrary function of space and time, thus allowing an effective separation of the atomistic 'core' and continuum 'environment'. To fill the scale gap between the atomistic and the continuum representations, our special-purpose computer for molecular dynamics, MDGRAPE-4, as well as GPU-based computing, was used for developing the framework. These hardware developments also include interactive molecular dynamics simulations that allow intervention in the modelling through force-feedback devices.
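The coupling parameter can be pictured as a simple convex blend; the hypothetical helper below (names are illustrative, not from the paper) mixes an atomistic velocity with a hydrodynamic flow velocity according to s(x, t) in [0, 1]:

```python
def blend_velocity(v_md, v_hydro, s):
    """Convex blend: s = 1 is pure molecular dynamics, s = 0 pure hydrodynamics."""
    if not 0.0 <= s <= 1.0:
        raise ValueError("coupling parameter s must lie in [0, 1]")
    return s * v_md + (1.0 - s) * v_hydro
```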
Abstract:
We overview our recent developments in the theory of dispersion-managed (DM) solitons within the context of optical applications. First, we present a class of localized solutions with a period that is a multiple of that of the standard DM soliton in the nonlinear Schrödinger equation with periodic variations of the dispersion. In the framework of a reduced ordinary differential equation-based model, we discuss the key features of these structures, such as a smaller energy compared to traditional DM solitons with the same temporal width. Next, we present new results on dissipative DM solitons, which occur in the context of mode-locked lasers. By means of numerical simulations and a reduced variational model of the complex Ginzburg-Landau equation, we analyze the influence of the different dissipative processes that take place in a laser.
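A dispersion map alternates fibre segments of opposite dispersion sign; its path-average dispersion is the length-weighted mean. The snippet below (an illustrative calculation, with example segment values that are not from the abstract) computes it:

```python
def path_average_dispersion(segments):
    """segments: list of (dispersion in ps/nm/km, length in km) pairs."""
    total_len = sum(length for _, length in segments)
    return sum(d * length for d, length in segments) / total_len
```

For example, 50 km of +16 ps/nm/km fibre compensated by 10 km of -80 ps/nm/km fibre gives a zero path-average map, the kind of periodic dispersion variation in which DM solitons propagate.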
Abstract:
We identify two different forms of diversity present in engineered collective systems, namely heterogeneity (genotypic/phenotypic diversity) and dynamics (temporal diversity). Three qualitatively different case studies are analysed, and it is shown that both forms of diversity can be beneficial in very different problem and application domains. Behavioural diversity is shown to be motivated by input diversity and this observation is used to present recommendations for designers of collective systems.
Abstract:
We present a semi-analytical method for dimensioning Reed-Solomon codes for coherent DQPSK systems with laser phase noise and cycle slips. We evaluate the accuracy of our method for a 28 Gbaud system using numerical simulations.
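A standard building block for dimensioning Reed-Solomon codes (a textbook bound, not the paper's semi-analytical method, and it ignores the cycle-slip correlations the paper models) is the binomial tail giving the block error rate of an RS(n, k) code that corrects up to t = (n - k)/2 symbol errors:

```python
from math import comb

def rs_block_error_rate(n, k, p_sym):
    """P(block error) for RS(n, k) with i.i.d. symbol error probability p_sym."""
    t = (n - k) // 2  # correctable symbol errors per block
    return sum(comb(n, i) * p_sym**i * (1.0 - p_sym)**(n - i)
               for i in range(t + 1, n + 1))
```

Sweeping p_sym lets one pick the overhead n - k needed to hit a target post-FEC error rate.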
Abstract:
We show that by proper code design, phase noise induced cycle slips causing an error floor can be mitigated for 28 Gbaud DQPSK systems. The performance of BCH codes is investigated in terms of required overhead. © 2014 OSA.
Abstract:
In this paper a new approach to the resource allocation and scheduling mechanism that reflects the effect of the user's Quality of Experience is presented. The proposed scheduling algorithm is examined in the context of the 3GPP Long Term Evolution (LTE) system. Pause Intensity (PI), an objective, no-reference quality assessment metric, is employed to represent user satisfaction in the scheduler of the eNodeB. PI is a measurement of discontinuity in the service. The performance of the proposed scheduling method is compared with two extreme cases: the maxCI and Round Robin scheduling schemes, which correspond to efficiency- and fairness-oriented mechanisms, respectively. Our work reveals that the proposed method is able to operate between the fairness and efficiency requirements, raising user satisfaction to the desired level. © VDE VERLAG GMBH.
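The two baseline schemes the abstract compares against can be sketched in a few lines (a toy illustration, not the paper's QoE-aware scheduler): maxCI always serves the user with the best channel, while Round Robin cycles through users regardless of channel state.

```python
def max_ci(channel_quality):
    """Efficiency-oriented: schedule the user with the best instantaneous channel."""
    return max(range(len(channel_quality)), key=lambda u: channel_quality[u])

class RoundRobin:
    """Fairness-oriented: serve users cyclically, ignoring channel state."""
    def __init__(self, n_users):
        self.n, self.next_user = n_users, 0
    def pick(self):
        u = self.next_user
        self.next_user = (self.next_user + 1) % self.n
        return u
```

A QoE-aware scheduler sits between these extremes, weighting channel quality against a per-user discontinuity metric such as PI.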
Abstract:
A Vehicle-to-Grid (V2G) system with efficient Demand Response Management (DRM) is critical for supplying electricity by utilizing the surplus electricity available in EVs. An incentivized DRM approach is studied to reduce the system cost and maintain system stability. EVs are motivated with dynamic pricing determined by a group-selling-based auction. In the proposed approach, a number of aggregators sit at the first auction level and are responsible for communicating with groups of EVs. EVs, as bidders, consider Quality of Energy (QoE) requirements and report their interests and decisions in the bidding process coordinated by the associated aggregator. Auction winners are determined based on the bidding prices and the amount of electricity sold by the EV bidders. We investigate the impact of the proposed mechanism on system performance under the maximum feedback power constraints of the aggregators. The designed mechanism is proven to have essential economic properties. Simulation results indicate that the proposed mechanism can reduce the system cost and offer EVs significant incentives to participate in the V2G DRM operation.
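Winner determination in this kind of procurement auction can be sketched as a greedy fill (a simplified illustration, not the paper's two-level mechanism): accept the cheapest energy bids until the aggregator's capacity constraint is met.

```python
def select_winners(bids, capacity_kwh):
    """bids: list of (ev_id, price_per_kwh, energy_kwh); buy cheapest energy first."""
    winners, remaining = [], capacity_kwh
    for ev, price, energy in sorted(bids, key=lambda b: b[1]):
        if remaining <= 0:
            break
        take = min(energy, remaining)  # partial acceptance at the margin
        winners.append((ev, take))
        remaining -= take
    return winners
```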
Abstract:
In oscillatory reaction-diffusion systems, time-delay feedback can lead to the instability of uniform oscillations with respect to the formation of standing waves. Here, we investigate how the presence of additive, Gaussian white noise can induce the appearance of standing waves. Combining analytical solutions of the model with spatio-temporal simulations, we find that noise can promote standing waves in regimes where the deterministic uniform oscillatory modes are stabilized. As the deterministic phase boundary is approached, the spatio-temporal correlations become stronger, such that even small noise can induce standing waves in this parameter regime. With larger noise strengths, standing waves can be induced at finite distances from the (deterministic) phase boundary. The overall dynamics are defined through the interplay of the noisy forcing with the inherent reaction-diffusion dynamics.
Abstract:
One of the reasons for using variability in the software product line (SPL) approach (see Apel et al., 2006; Figueiredo et al., 2008; Kastner et al., 2007; Mezini & Ostermann, 2004) is to delay a design decision (Svahnberg et al., 2005). Instead of deciding on what system to develop in advance, with the SPL approach a set of components and a reference architecture are specified and implemented (during domain engineering, see Czarnecki & Eisenecker, 2000), out of which individual systems are composed at a later stage (during application engineering, see Czarnecki & Eisenecker, 2000). By postponing the design decisions in such a manner, it is possible to better fit the resultant system to its intended environment, for instance, to allow selection of the system interaction mode to be made after the customers have purchased particular hardware, such as a PDA vs. a laptop. Such variability is expressed through variation points, which are locations in a software-based system where choices are available for defining a specific instance of a system (Svahnberg et al., 2005). Until recently it had sufficed to postpone committing to a specific system instance until just before system runtime. However, in recent years the use and expectations of software systems in human society have undergone significant changes. Today's software systems need to be always available, highly interactive, and able to continuously adapt according to the varying environment conditions, user characteristics and characteristics of other systems that interact with them. Such systems, called adaptive systems, are expected to be long-lived and able to undertake adaptations with little or no human intervention (Cheng et al., 2009). Therefore, the variability now needs to be present also at system runtime, which leads to the emergence of a new type of system: adaptive systems with dynamic variability.
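A variation point that stays rebindable at runtime can be sketched as follows (a hypothetical illustration of the concept, including the PDA-vs-laptop interaction-mode example; the class and variant names are invented):

```python
class VariationPoint:
    """A location where one of several variants is bound, possibly at runtime."""
    def __init__(self, variants, default):
        self.variants, self.bound = variants, default
    def bind(self, name):
        # Rebinding at runtime is what distinguishes dynamic variability.
        if name not in self.variants:
            raise ValueError(f"unknown variant: {name}")
        self.bound = name
    def resolve(self):
        return self.variants[self.bound]

interaction_mode = VariationPoint(
    {"pda": "stylus UI", "laptop": "keyboard UI"}, default="laptop")
```

In a classic SPL the `bind` call happens once during application engineering; in an adaptive system it may be invoked repeatedly while the system runs.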
Abstract:
The performance of most operations systems is significantly affected by the interaction of human decision-makers. A methodology, based on the use of visual interactive simulation (VIS) and artificial intelligence (AI), is described that aims to identify and improve human decision-making in operations systems. The methodology, known as 'knowledge-based improvement' (KBI), elicits knowledge from a decision-maker via a VIS and then uses AI methods to represent decision-making. By linking the VIS and AI representation, it is possible to predict the performance of the operations system under different decision-making strategies and to search for improved strategies. The KBI methodology is applied to the decision-making surrounding unplanned maintenance operations at a Ford Motor Company engine assembly plant.
Abstract:
In May 2006, the Ministers of Health of all the countries on the African continent, at a special session of the African Union, undertook to institutionalise efficiency monitoring within their respective national health information management systems. The specific objectives of this study were: (i) to assess the technical efficiency of the National Health Systems (NHSs) of African countries in terms of male and female life expectancies, and (ii) to assess changes in health productivity over time with a view to analysing changes in efficiency and changes in technology. The analysis was based on five-year panel data (1999-2003) from all 53 countries of continental Africa. Data Envelopment Analysis (DEA), a non-parametric linear programming approach, was employed to assess technical efficiency. The Malmquist Total Factor Productivity (MTFP) index was used to analyse efficiency and productivity change over time among the 53 countries' national health systems. The data consisted of two outputs (male and female life expectancies) and two inputs (per capita total health expenditure and adult literacy). The DEA revealed that 49 (92.5%) countries' NHSs were run inefficiently in 1999 and 2000; 50 (94.3%), 48 (90.6%) and 47 (88.7%) operated inefficiently in 2001, 2002, and 2003, respectively. All 53 countries' national health systems registered improvements in total factor productivity, attributable mainly to technical progress. Fifty-two countries did not experience any change in scale efficiency, while thirty (56.6%) countries' national health systems had a Pure Efficiency Change (PEFFCH) index of less than one, signifying that those countries' NHSs' pure efficiency contributed negatively to productivity change. African countries may need to critically evaluate the utility of institutionalising Malmquist TFP-type analyses to monitor changes in health systems' economic efficiency and productivity over time.
Keywords: African national health systems; per capita total health expenditure; technical efficiency; scale efficiency; Malmquist indices of productivity change; DEA
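In the simplest single-input, single-output case, DEA-style technical efficiency reduces to each unit's output/input ratio normalised by the best ratio on the frontier. The sketch below (a toy with made-up units, not the study's two-input, two-output model, which requires a linear programming solver) shows that idea:

```python
def ccr_efficiency(units):
    """units: {name: (input, output)}; returns efficiency scores in (0, 1]."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())  # the frontier ratio
    return {name: r / best for name, r in ratios.items()}
```

Units scoring 1.0 lie on the efficiency frontier; scores below 1.0 indicate how far a unit falls short of best practice, which is what the study's inefficiency percentages summarise.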
Abstract:
This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz '63 (3-dimensional) model. The algorithms are also applied to the 40-dimensional stochastic Lorenz '96 system. In this investigation these new approaches are compared with a variety of other well-known methods, such as the ensemble Kalman filter / smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system states and model parameters) and full weak-constraint 4D-Var. An empirical analysis of their asymptotic behaviour as the observation density or the length of the time window increases is provided.
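The Ornstein-Uhlenbeck benchmark is attractive precisely because its exact likelihood is available in closed form. The sketch below (a standard textbook formula, not the thesis code) evaluates the exact transition-density log-likelihood of dx = -θx dt + σ dW for equally spaced observations:

```python
from math import exp, log, pi

def ou_loglik(xs, dt, theta, sigma):
    """Exact log-likelihood of an OU path observed at spacing dt."""
    ll = 0.0
    for x0, x1 in zip(xs, xs[1:]):
        # Gaussian transition: mean decays, variance approaches sigma^2/(2*theta).
        mean = x0 * exp(-theta * dt)
        var = sigma**2 * (1.0 - exp(-2.0 * theta * dt)) / (2.0 * theta)
        ll += -0.5 * (log(2.0 * pi * var) + (x1 - mean)**2 / var)
    return ll
```

Approximate inference schemes such as the variational frameworks above can then be scored against this exact likelihood on the OU process before being trusted on the double well or Lorenz systems.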