932 results for Markov chains. Convergence. Evolutionary Strategy. Large Deviations
Abstract:
In a monetary union, national fiscal deficits are of limited help to counteract deep recessions; union-wide support is needed. A common euro-area budget (1) should provide a temporary but significant transfer of resources in case of large regional shocks, (2) would be an instrument to counteract severe recessions in the area as a whole, and (3) would ensure financial stability. The four main options for stabilisation of regional shocks to the euro area are: unemployment insurance, payments related to deviations of output from potential, the narrowing of large spreads, and discretionary spending. The common resource would need to be well designed to be distributionally neutral, avoid free-riding behaviour and foster structural change, while being of sufficient size to have an impact. Linking budget support to large deviations of output from potential appears to be the best option. A borrowing capacity equipped with a structural balanced-budget rule could address area-wide shocks. It could serve as the fiscal backstop to the bank resolution authority. Resources amounting to 2 percent of euro-area GDP would be needed for stabilisation policy and financial stability.
Abstract:
Numerous techniques exist which can be used for the task of behavioural analysis and recognition. Common amongst these are Bayesian networks and Hidden Markov Models. Although these techniques are extremely powerful and well developed, both have important limitations. By fusing these techniques together to form Bayes-Markov chains, the advantages of both techniques can be preserved, while reducing their limitations. The Bayes-Markov technique forms the basis of a common, flexible framework for supplementing Markov chains with additional features. This results in improved user output, and aids in the rapid development of flexible and efficient behaviour recognition systems.
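The abstract does not spell out the fused Bayes-Markov model. The sketch below only illustrates the general idea of supplementing a Markov chain over behaviour states with Bayesian observation updates, using a standard forward-filtering step; the states, observation likelihoods and numbers are illustrative assumptions, not the authors' Bayes-Markov formulation.

```python
import numpy as np

# Illustrative behaviour states and an assumed Markov chain over them.
states = ["walking", "loitering", "running"]
T = np.array([[0.8, 0.15, 0.05],   # transition probabilities P(next | current)
              [0.2, 0.7,  0.1 ],
              [0.3, 0.1,  0.6 ]])

# Assumed observation likelihoods P(sensor reading | state) for two readings.
likelihood = {"slow": np.array([0.6, 0.9, 0.1]),
              "fast": np.array([0.4, 0.1, 0.9])}

def forward_step(belief, obs):
    """One Bayesian update of the belief over states: predict with the
    Markov chain, then weight by the observation likelihood and renormalise."""
    predicted = belief @ T
    posterior = predicted * likelihood[obs]
    return posterior / posterior.sum()

belief = np.array([1/3, 1/3, 1/3])          # uniform prior over behaviours
for obs in ["slow", "slow", "fast", "fast"]:
    belief = forward_step(belief, obs)
print(dict(zip(states, belief.round(3))))
```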
Abstract:
In this paper we consider hybrid (fast stochastic approximation and deterministic refinement) algorithms for Matrix Inversion (MI) and Solving Systems of Linear Equations (SLAE). Monte Carlo methods are used for the stochastic approximation, since it is known that they are very efficient in finding a quick rough approximation of an element or a row of the inverse matrix, or of a component of the solution vector. We show how the stochastic approximation of the MI can be combined with a deterministic refinement procedure to obtain the MI with the required precision, and further to solve the SLAE using the MI. We employ a splitting A = D − C of a given non-singular matrix A, where D is a diagonally dominant matrix and C is a diagonal matrix. In our algorithm for solving SLAE and MI, different choices of D can be considered in order to control the norm of the iteration matrix T = D^{-1}C of the resulting SLAE and to minimize the number of Markov chains required to reach a given precision. Further, we run the algorithms on a mini-Grid and investigate their efficiency depending on the granularity. Corresponding experimental results are presented.
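The abstract does not give the algorithms themselves. The sketch below only illustrates the splitting A = D − C, the iteration matrix T = D^{-1}C whose norm is to be controlled, and a simple fixed-point refinement of the SLAE solution from a rough initial guess. Taking D as the diagonal of A and using a Jacobi-type refinement are assumptions made for illustration, not the paper's Monte Carlo procedure.

```python
import numpy as np

# Assumed small test system; D is taken as the diagonal of A for illustration.
A = np.array([[4.0, 1.0, 0.5],
              [1.0, 5.0, 1.0],
              [0.5, 1.0, 3.0]])
b = np.array([1.0, 2.0, 3.0])

D = np.diag(np.diag(A))          # splitting A = D - C
C = D - A
T = np.linalg.solve(D, C)        # iteration matrix T = D^{-1} C
print("||T||_inf =", np.linalg.norm(T, ord=np.inf))   # should be < 1 for convergence

# Rough initial guess (standing in for a Monte Carlo estimate), then
# deterministic refinement x_{k+1} = T x_k + D^{-1} b until the residual is small.
x = np.zeros_like(b)
f = np.linalg.solve(D, b)
while np.linalg.norm(b - A @ x) > 1e-10:
    x = T @ x + f
print("refined solution:", x)
```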
Abstract:
In Sweden, there are about 0.5 million single-family houses that are heated by electricity alone, and rising electricity costs force the conversion to other heating sources such as heat pumps and wood pellet heating systems. Pellet heating systems for single-family houses are currently a strongly growing market. A future shortage of wood fuels is possible even in Sweden, and combining wood pellet heating with solar heating will help to save the bio-fuel resources. The objectives of this thesis are to investigate how electrically heated single-family houses can be converted to pellet and solar heating systems, and how the annual efficiency and solar gains can be increased in such systems. The possible reduction of CO emissions by combining pellet heating with solar heating has also been investigated. Systems with pellet stoves (both with and without a water jacket), pellet boilers and solar heating have been simulated. Different system concepts have been compared in order to identify the most promising solutions. Modifications in system design and control strategies have been carried out in order to increase the system efficiency and the solar gains. Possibilities for increasing the solar gains have been limited to the investigation of DHW-units for hot water production and the use of hot water for heating of dishwashers and washing machines via a heat exchanger instead of electricity (heat-fed appliances). Computer models of pellet stoves, boilers, DHW-units and heat-fed appliances have been developed, and the parameters for the models have been identified from measurements on real components. The agreement between the models and the measurements has been checked. The systems with wood pellet stoves have been simulated in three different multi-zone buildings, simulated in detail with heat distribution through door openings between the zones. For the other simulations, either a single-zone house model or a load file has been used. Simulations were carried out for Stockholm, Sweden, and, for the simulations with heat-fed appliances, also for Miami, USA. The foremost result of this thesis is an increased understanding of the dynamic operation of combined pellet and solar heating systems for single-family houses. The results show that electricity savings and annual system efficiency are strongly affected by the system design and the control strategy. Large reductions in pellet consumption are possible by combining pellet boilers with solar heating (a reduction larger than the solar gains if the system is properly designed). In addition, large reductions in carbon monoxide emissions are possible. To achieve these reductions it is required that the hot water production and the connection of the radiator circuit are moved to a well insulated, solar-heated buffer store so that the boiler can be turned off during the periods when the solar collectors cover the heating demand. The amount of electricity replaced using systems with pellet stoves is very dependent on the house plan, the system design, whether internal doors are open or closed, and the comfort requirements. Proper system design and control strategies are crucial to obtain high electricity savings and high comfort with pellet stove systems. The investigated technologies for increasing the solar gains (DHW-units and heat-fed appliances) significantly increase the solar gains, but for the heat-fed appliances market introduction is difficult due to the limited financial savings and the need for a new heat distribution system.
The applications closest to market introduction could be communal laundries and use in sunny climates where the dominant part of the heat demand can be covered by solar heating. The DHW-unit is economical but competes with the internal finned-tube heat exchanger, which is by far the dominant technology for hot water preparation in solar combisystems for single-family houses.
Abstract:
The idea of considering imprecision in probabilities is old, beginning with the work of George Boole, who in 1854 sought to reconcile classical logic, which allows the modelling of complete ignorance, with probabilities. In 1921, John Maynard Keynes made explicit use of intervals in his book to represent imprecision in probabilities. But it was only with the work of Walley in 1991 that principles were established which a probability theory dealing with imprecision should respect. With the emergence of fuzzy set theory by Lotfi Zadeh in 1965, another way of dealing with uncertainty and imprecise concepts appeared. Several ways of bringing Zadeh's ideas into probability theory were soon proposed, addressing imprecision either in the events to which probabilities are assigned or in the probability values themselves. In particular, from 2003 James Buckley began to develop a probability theory in which the probability values are fuzzy numbers. This fuzzy probability follows principles analogous to Walley's imprecise probabilities. On the other hand, the use of real numbers between 0 and 1 as truth degrees, as originally proposed by Zadeh, has the drawback of using very precise values to deal with uncertainty (how can one distinguish an element that satisfies a property to degree 0.423 from one that satisfies it to degree 0.424?). This motivated the development of several extensions of fuzzy set theory that incorporate some kind of imprecision. This work considers the extension proposed by Krassimir Atanassov in 1983, which adds an extra degree of uncertainty to model the hesitation in assigning the membership degree: one value indicates the degree to which the object belongs to the set, while the other indicates the degree to which it does not belong. In Zadeh's fuzzy set theory, this non-membership degree is, by default, the complement of the membership degree. In Atanassov's approach, the non-membership degree is to some extent independent of the membership degree, and the difference between the non-membership degree and the complement of the membership degree reveals the hesitation at the moment of assigning a membership degree. This extension is today called Atanassov's intuitionistic fuzzy set theory. It is worth noting that the term "intuitionistic" here has no relation to the term as used in intuitionistic logic. In this work, two proposals for interval probability are developed: the restricted interval probability and the unrestricted interval probability; two notions of fuzzy probability are also introduced: the constrained fuzzy probability and the unconstrained fuzzy probability; and, finally, two notions of intuitionistic fuzzy probability are introduced: the restricted intuitionistic fuzzy probability and the unrestricted intuitionistic fuzzy probability.
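As a small illustration of the hesitation idea only (not of the thesis's probability constructions): an Atanassov-style element carries a membership degree mu and a non-membership degree nu with mu + nu <= 1, and the leftover pi = 1 - mu - nu measures the hesitation; in Zadeh's ordinary fuzzy sets nu = 1 - mu and pi = 0. The class below is an assumed, minimal encoding of exactly that constraint.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IFSElement:
    """Atanassov-style intuitionistic fuzzy membership of one object."""
    mu: float   # membership degree
    nu: float   # non-membership degree (independent of mu, but mu + nu <= 1)

    def __post_init__(self):
        if not (0.0 <= self.mu <= 1.0 and 0.0 <= self.nu <= 1.0
                and self.mu + self.nu <= 1.0):
            raise ValueError("require 0 <= mu, nu and mu + nu <= 1")

    @property
    def hesitation(self) -> float:
        # pi = 1 - mu - nu: how far nu falls short of the Zadeh complement 1 - mu.
        return 1.0 - self.mu - self.nu

print(IFSElement(mu=0.6, nu=0.3).hesitation)   # 0.1 of hesitation
print(IFSElement(mu=0.6, nu=0.4).hesitation)   # 0.0: an ordinary fuzzy membership
```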
Analytical and Monte Carlo approaches to evaluate probability distributions of interruption duration
Abstract:
Regulatory authorities in many countries, in order to maintain an acceptable balance between appropriate customer service quality and costs, are introducing performance-based regulation. These regulations impose penalties, and in some cases rewards, which introduce a component of financial risk to an electric power utility due to the uncertainty associated with preserving a specific level of system reliability. In Brazil, for instance, one of the reliability indices receiving special attention from the utilities is the maximum continuous interruption duration (MCID) per customer. This parameter is responsible for the majority of penalties in many electric distribution utilities. This paper describes analytical and Monte Carlo simulation approaches to evaluate probability distributions of interruption duration indices. More emphasis will be given to the development of an analytical method to assess the probability distribution associated with the parameter MCID and the corresponding penalties. Case studies on a simple distribution network and on a real Brazilian distribution system are presented and discussed.
Abstract:
Recent studies have shown that adaptive X̄ control charts are quicker than traditional X̄ charts in detecting small to moderate shifts in a process. In this article, we propose a joint statistical design of adaptive X̄ and R charts in which all design parameters vary adaptively. The process is subjected to two independent assignable causes. One cause changes the process mean and the other changes the process variance. However, the occurrence of one kind of assignable cause does not preclude the occurrence of the other. It is assumed that the quality characteristic is normally distributed and that the time the process remains in control has an exponential distribution. Performance measures of these adaptive control charts are obtained through a Markov chain approach. (c) 2005 Elsevier B.V. All rights reserved.
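The article's joint adaptive design with two assignable causes is not reproduced here. The sketch below only shows the generic Markov-chain mechanics behind such performance measures for an assumed simple adaptive (VSI-style) X̄ chart: the transient states correspond to which sampling regime is in force, and the fundamental matrix (I − Q)^{-1} gives the average run length and average time to signal. All numerical parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Assumed VSI-style chart parameters (illustrative, not those of the article):
L, w = 3.0, 1.0               # action and warning limits (standardized units)
n = 5                         # fixed sample size
t_long, t_short = 1.0, 0.25   # sampling intervals after a central / warning point
delta = 0.5                   # shift in the process mean, in sigma units

mu = delta * np.sqrt(n)       # mean of the standardized sample mean after the shift
p_central = norm.cdf(w - mu) - norm.cdf(-w - mu)
p_warning = (norm.cdf(L - mu) - norm.cdf(-L - mu)) - p_central

# Transient states: 0 = last point in the central region, 1 = in the warning region.
Q = np.array([[p_central, p_warning],
              [p_central, p_warning]])
M = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix: expected visits per state
p0 = np.array([1.0, 0.0])          # assume the chart starts in the relaxed state
t = np.array([t_long, t_short])

ARL = p0 @ M @ np.ones(2)          # average run length (samples until a signal)
ATS = p0 @ M @ t                   # average time to signal, weighting by interval
print(f"ARL = {ARL:.1f} samples, ATS = {ATS:.2f} time units")
```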
Abstract:
Recent theoretical studies have shown that the X̄ chart with variable sampling intervals (VSI) and the X̄ chart with variable sample size (VSS) are quicker than the traditional X̄ chart in detecting shifts in the process. This article considers the X̄ chart with variable sample sizes and sampling intervals (VSSI). It is assumed that the amount of time the process remains in control has an exponential distribution. The properties of the VSSI X̄ chart are obtained using Markov chains. The VSSI X̄ chart is even quicker than the VSI or VSS X̄ charts in detecting moderate shifts in the process.
Abstract:
Recent studies have shown that the X̄ chart with variable sampling intervals (VSI) and/or with variable sample sizes (VSS) detects process shifts faster than the traditional X̄ chart. This article extends these studies to processes that are monitored by both the X̄ and R charts. A Markov chain model is used to determine the properties of the joint X̄ and R charts with variable sample sizes and sampling intervals (VSSI). The VSSI scheme improves the joint X̄ and R control chart performance in terms of the speed with which shifts in the process mean and/or variance are detected.
Abstract:
The biggest advantage of plasma immersion ion implantation (PIII) is the capability of treating objects with irregular geometry without complex manipulation of the target holder. The effectiveness of this approach relies on the uniformity of the incident ion dose. Unfortunately, perfect dose uniformity is usually difficult to achieve when treating samples of complex shape. The problems arise from the non-uniform plasma density and the expansion of the plasma sheath. A particle-in-cell computer simulation is used to study the time-dependent evolution of the plasma sheath surrounding two-dimensional objects during plasma immersion ion implantation. Before starting the implantation phase, a steady-state nitrogen plasma is established inside the simulation volume by ionization of the gas precursor with primary electrons. The plasma self-consistently evolves to a non-uniform density distribution, which is used as the initial density distribution for the implantation phase. As a result, we obtain a more realistic description of the plasma sheath expansion and dynamics. The ion current density on the target, the average impact energy, and the trajectories of the implanted ions were calculated for three geometrical shapes. Large deviations from a uniform dose distribution have been observed for targets with irregular shapes. In addition, the effect of secondary electron emission has been included in our simulation, and no qualitative modifications to the sheath dynamics have been noticed. However, the energetic secondary electrons drastically change the net plasma balance and also pose a significant X-ray hazard. Finally, an axial magnetic field has been added to the calculations, and the possibility of magnetic insulation of the secondary electrons has been demonstrated.
Abstract:
Regulatory authorities in many countries, in order to maintain an acceptable balance between appropriate customer service quality and costs, are introducing performance-based regulation. These regulations impose penalties, and in some cases rewards, which introduce a component of financial risk to an electric power utility due to the uncertainty associated with preserving a specific level of system reliability. In Brazil, for instance, one of the reliability indices receiving special attention from the utilities is the Maximum Continuous Interruption Duration per customer (MCID). This paper describes a chronological Monte Carlo simulation approach to evaluate probability distributions of reliability indices, including the MCID, and the corresponding penalties. In order to achieve the desired efficiency, modern computational techniques are used for modeling (UML, Unified Modeling Language) as well as for programming (Object-Oriented Programming). Case studies on a simple distribution network and on real Brazilian distribution systems are presented and discussed. © Copyright KTH 2006.
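The paper's network models, UML design and penalty rules are not reproduced here. The sketch below only shows the generic shape of a chronological Monte Carlo estimate of the MCID distribution at a single load point, with an assumed annual failure rate, assumed lognormal interruption durations and an assumed regulatory limit on the longest single interruption.

```python
import numpy as np

rng = np.random.default_rng(42)

failure_rate = 4.0       # assumed interruptions per year at the load point
mean_repair_h = 2.0      # assumed mean interruption duration, hours
mcid_limit_h = 6.0       # assumed regulatory limit on the longest interruption
n_years = 100_000        # number of simulated yearly histories

mcid = np.empty(n_years)
for y in range(n_years):
    n_failures = rng.poisson(failure_rate)
    if n_failures == 0:
        mcid[y] = 0.0
    else:
        # Lognormal durations with the assumed mean (sigma chosen arbitrarily).
        durations = rng.lognormal(mean=np.log(mean_repair_h) - 0.5 * 0.8**2,
                                  sigma=0.8, size=n_failures)
        mcid[y] = durations.max()   # longest continuous interruption in the year

print("P(MCID > limit) =", (mcid > mcid_limit_h).mean())
print("95th percentile of MCID [h] =", np.quantile(mcid, 0.95))
```

The empirical exceedance probability and percentiles estimated this way are what a utility would feed into its penalty-risk assessment; the analytical method mentioned in the earlier abstract targets the same distribution without sampling.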