895 results for Markov chains. Convergence. Evolutionary Strategy. Large Deviations
Abstract:
The biggest advantage of plasma immersion ion implantation (PIII) is the capability of treating objects with irregular geometry without complex manipulation of the target holder. The effectiveness of this approach relies on the uniformity of the incident ion dose. Unfortunately, perfect dose uniformity is usually difficult to achieve when treating samples of complex shape. The problems arise from the non-uniform plasma density and the expansion of the plasma sheath. A particle-in-cell computer simulation is used to study the time-dependent evolution of the plasma sheath surrounding two-dimensional objects during the plasma immersion ion implantation process. Before the implantation phase starts, a steady-state nitrogen plasma is established inside the simulation volume by ionizing the precursor gas with primary electrons. The plasma self-consistently evolves to a non-uniform density distribution, which is used as the initial density distribution for the implantation phase. As a result, we obtain a more realistic description of the plasma sheath expansion and dynamics. The ion current density on the target, the average impact energy, and the trajectories of the implanted ions were calculated for three geometrical shapes. Large deviations from a uniform dose distribution have been observed for targets with irregular shapes. In addition, the effect of secondary electron emission has been included in our simulation, and no qualitative modifications to the sheath dynamics have been noticed. However, the energetic secondary electrons drastically change the net plasma balance and also pose a significant X-ray hazard. Finally, an axial magnetic field has been added to the calculations, and the possibility of magnetic insulation of secondary electrons has been demonstrated.
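As a rough companion to the abstract above, the snippet below sketches the standard quasi-static Child-law model of planar sheath expansion during a PIII voltage pulse. This is a 1-D simplification, not the 2-D particle-in-cell simulation the paper describes, and the plasma density, voltage, and gas parameters are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D planar sheath-expansion sketch (quasi-static Child-law model),
# NOT the 2-D particle-in-cell approach described in the abstract.
# All numerical parameters below are illustrative assumptions.

eps0 = 8.854e-12          # vacuum permittivity [F/m]
e    = 1.602e-19          # elementary charge [C]
M    = 14 * 1.661e-27     # nitrogen ion mass (N+) [kg]
n0   = 1e16               # bulk plasma density [m^-3] (assumed)
Te   = 2.0                # electron temperature [eV] (assumed)
V0   = 2.0e4              # applied pulse voltage magnitude [V] (assumed)

u_B = np.sqrt(e * Te / M)                     # Bohm velocity [m/s]
s   = np.sqrt(2 * eps0 * V0 / (e * n0))       # initial matrix-sheath width [m]

dt, t_end = 1e-9, 5e-6
t, dose = 0.0, 0.0
while t < t_end:
    # Child-law ion current density that the sheath can sustain at width s
    J_C = (4.0 / 9.0) * eps0 * np.sqrt(2 * e / M) * V0**1.5 / s**2
    # Charge balance: uncovered ions plus Bohm flux feed the Child-law current
    ds_dt = J_C / (e * n0) - u_B
    s += ds_dt * dt
    dose += (J_C / e) * dt        # implanted ions per unit target area
    t += dt

print(f"sheath width after {t_end*1e6:.1f} us: {s*100:.2f} cm, dose = {dose:.3e} m^-2")
```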
Abstract:
Regulatory authorities in many countries, in order to maintain an acceptable balance between customer service quality and costs, are introducing performance-based regulation. These regulations impose penalties, and in some cases rewards, which introduce a component of financial risk to an electric power utility due to the uncertainty associated with preserving a specific level of system reliability. In Brazil, for instance, one of the reliability indices receiving special attention from the utilities is the Maximum Continuous Interruption Duration per customer (MCID). This paper describes a chronological Monte Carlo simulation approach to evaluate probability distributions of reliability indices, including the MCID, and the corresponding penalties. To achieve the desired efficiency, modern computational techniques are used for modeling (UML - Unified Modeling Language) as well as for programming (Object-Oriented Programming). Case studies on a simple distribution network and on real Brazilian distribution systems are presented and discussed. © Copyright KTH 2006.
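The sketch below illustrates the chronological Monte Carlo idea behind indices such as the MCID for a single load point with one failure/repair process. The failure rate, repair time, and exponential models are illustrative assumptions; the paper's UML/OOP network models are not reproduced.

```python
import random

# Chronological Monte Carlo sketch for one load point fed by one component.
# LAMBDA and MTTR are illustrative assumptions, not values from the paper.

LAMBDA = 4.0      # failures per year (assumed)
MTTR   = 3.0      # mean time to repair, in hours (assumed)
YEARS  = 10000    # number of simulated years

mcid_samples = []                     # Max. Continuous Interruption Duration per year
for _ in range(YEARS):
    t, interruptions = 0.0, []
    while True:
        t += random.expovariate(LAMBDA) * 8760.0    # time to next failure [h]
        if t >= 8760.0:
            break
        d = random.expovariate(1.0 / MTTR)          # interruption duration [h]
        interruptions.append(d)
        t += d
    mcid_samples.append(max(interruptions) if interruptions else 0.0)

mcid_samples.sort()
print("mean MCID [h]:", sum(mcid_samples) / YEARS)
print("95th percentile MCID [h]:", mcid_samples[int(0.95 * YEARS)])
```

The sorted sample is an empirical probability distribution of the MCID, from which penalty probabilities can be read off directly.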
Abstract:
In this paper, a novel methodology to price the reactive power support ancillary service of Distributed Generators (DGs) with primary energy source uncertainty is presented. The proposed methodology prices the service based on the calculation of the Loss of Opportunity Costs (LOC). An algorithm is proposed to reduce the uncertainty present in these generators by using Multiobjective Power Flows (MOPFs) implemented in multiple probabilistic scenarios through Monte Carlo Simulations (MCS), and by modeling the time series associated with the active power generation of the DGs through Markov Chains (MC). © 2011 IEEE.
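The snippet below sketches only the Markov-chain step of such a methodology: fitting a transition matrix to a discretised DG active-power series and sampling synthetic scenarios for Monte Carlo studies. The series, state discretisation, and scenario length are illustrative assumptions; the multiobjective power flow and pricing stages are not shown.

```python
import numpy as np

# Markov-chain scenario generation sketch; the input series is synthetic.

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0, 0.1, 2000)) % 1.0   # assumed per-unit DG output in [0, 1)

n_states = 5
states = np.minimum((series * n_states).astype(int), n_states - 1)

# Maximum-likelihood transition matrix from observed state pairs
P = np.zeros((n_states, n_states))
for a, b in zip(states[:-1], states[1:]):
    P[a, b] += 1
row_sums = P.sum(axis=1, keepdims=True)
# Normalise rows; states never observed get a uniform fallback row
P = np.divide(P, row_sums, out=np.full_like(P, 1.0 / n_states), where=row_sums > 0)

def sample_scenario(P, start, length, rng):
    """Sample one synthetic state path from the fitted Markov chain."""
    path = [start]
    for _ in range(length - 1):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return np.array(path)

scenario = sample_scenario(P, states[0], 24, rng)
print("one 24-step scenario (state indices):", scenario)
```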
Abstract:
Distributed Generation, microgrid technologies, two-way communication systems, and demand response programs are issues that have been studied in recent years within the concept of smart grids. At a sufficient penetration level, Distributed Generators (DGs) can provide benefits for sub-transmission and transmission systems through the so-called ancillary services. This work focuses on the ancillary service of reactive power support provided by DGs, specifically Wind Turbine Generators (WTGs), with a high level of impact on transmission systems. The main objective of this work is to propose an optimization methodology to price this service by determining the costs a DG incurs when it loses the opportunity to sell active power, i.e., by determining the Loss of Opportunity Costs (LOC). LOC arise when more reactive power is required than is available and active power generation has to be reduced in order to increase the reactive power capability. In the optimization process, three objectives are considered: the active power generation costs of the DGs, the voltage stability margin of the system, and the losses in the lines of the network. Uncertainties of the WTGs are reduced by solving multi-objective optimal power flows in multiple probabilistic scenarios constructed by Monte Carlo simulations, and by modeling the time series associated with the active power generation of each WTG via Fuzzy Logic and Markov Chains. The proposed methodology was tested on the IEEE 14-bus test system with two WTGs installed. © 2011 IEEE.
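As a toy illustration of the LOC concept described above, the snippet below curtails active power against an assumed apparent-power limit when reactive support is requested and values the foregone sales at an assumed energy price. The rating and price are hypothetical, and the paper's multi-objective optimization is not represented.

```python
import math

# Loss of Opportunity Cost (LOC) for one operating point: when the requested
# reactive power pushes the machine against its apparent-power limit, active
# power is curtailed and the foregone energy is valued at a market price.
# S_MAX and PRICE are illustrative assumptions.

S_MAX = 2.0    # apparent power rating [MVA] (assumed)
PRICE = 60.0   # energy price [$/MWh] (assumed)

def loss_of_opportunity_cost(p_available_mw, q_required_mvar, hours=1.0):
    """LOC for one period: curtailed active power times energy price."""
    p_cap = math.sqrt(max(S_MAX**2 - q_required_mvar**2, 0.0))  # P limit at this Q
    p_delivered = min(p_available_mw, p_cap)
    return (p_available_mw - p_delivered) * hours * PRICE

# Wind offers 1.9 MW, but 1.0 Mvar of support caps P at ~1.73 MW
print(loss_of_opportunity_cost(p_available_mw=1.9, q_required_mvar=1.0))
```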
Abstract:
We investigate the possibilities of New Physics affecting the Standard Model (SM) Higgs sector. An effective Lagrangian with dimension-six operators is used to capture the effect of New Physics. We carry out a global Bayesian inference analysis, considering the recent LHC data set including all available correlations, as well as results from the Tevatron. Trilinear gauge boson couplings and electroweak precision observables are also taken into account. The case of weak boson tensorial couplings is closely examined, and NLO QCD corrections are taken into account in the deviations we predict. We consider two scenarios: one where the coefficients of all the dimension-six operators are essentially unconstrained, and one where a certain subset is loop-suppressed. In both scenarios, we find that large deviations from some of the SM Higgs couplings can still be present, assuming New Physics arising at 3 TeV. In particular, we find that a significantly reduced coupling of the Higgs to the top quark is possible and slightly favored by searches for Higgs production in association with top-quark pairs. The total width of the Higgs boson is only weakly constrained and can vary between 0.7 and 2.7 times the Standard Model value within the 95% Bayesian credible interval (BCI). We also observe sizeable effects induced by New Physics contributions to tensorial couplings. In particular, the Higgs boson decay width into Zγ can be enhanced by up to a factor of 12 within the 95% BCI. © 2013 SISSA.
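The toy sampler below illustrates only the Bayesian-credible-interval machinery: one effective coupling modifier constrained by a single Gaussian signal-strength measurement. The parameter, data, and prior are illustrative assumptions and bear no relation to the paper's global fit of dimension-six operators.

```python
import numpy as np

# Toy Metropolis-Hastings sketch: posterior of a coupling modifier 'kappa'
# given an assumed signal-strength measurement mu_obs ~ kappa^2.

rng = np.random.default_rng(1)
mu_obs, sigma = 1.1, 0.2          # assumed measurement and uncertainty

def log_post(kappa):
    if not (0.0 < kappa < 3.0):   # flat prior on a finite range (assumed)
        return -np.inf
    return -0.5 * ((kappa**2 - mu_obs) / sigma) ** 2

samples, kappa = [], 1.0
for _ in range(50000):
    prop = kappa + rng.normal(0, 0.1)            # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(kappa):
        kappa = prop                             # accept
    samples.append(kappa)

lo, hi = np.percentile(samples[5000:], [2.5, 97.5])   # drop burn-in
print(f"95% Bayesian credible interval for kappa: [{lo:.2f}, {hi:.2f}]")
```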
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Pós-graduação em Biometria - IBB
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
We study the potential effects of anomalous couplings of the third generation quarks to gauge bosons in rare B decays. We focus on the constraints from flavor-changing neutral current processes such as b→sγ and b→sl+l-. We consider both dimension-four and dimension-five operators and show that the latter can give large deviations from the standard model in the still unobserved dilepton modes, even after the bounds from b→sγ and precision electroweak observables are taken into account. © 2000 The American Physical Society.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The starting point of this article is the question "How to retrieve fingerprints of rhythm in written texts?" We address this problem in the case of Brazilian and European Portuguese. These two dialects of Modern Portuguese share the same lexicon, and most of the sentences they produce are superficially identical. Yet they are conjectured, on linguistic grounds, to implement different rhythms. We show that this linguistic question can be formulated as a problem of model selection in the class of variable length Markov chains. To carry out this approach, we compare texts from European and Brazilian Portuguese. These texts are first encoded according to some basic rhythmic features of the sentences, which can be automatically retrieved. This is an entirely new approach from the linguistic point of view. Our statistical contribution is the introduction of the smallest maximizer criterion, which is a constant-free procedure for model selection. As a by-product, this provides a solution to the problem of the optimal choice of the penalty constant when using the BIC to select a variable length Markov chain. Besides proving the consistency of the smallest maximizer criterion when the sample size diverges, we also conduct a simulation study comparing our approach with both the standard BIC selection and the Peres-Shields order estimation. Applied to the linguistic sample constituted for our case study, the smallest maximizer criterion assigns different context-tree models to the two dialects of Portuguese. The features of the selected models are compatible with current conjectures discussed in the linguistic literature.
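The snippet below illustrates the flavor of penalized model selection on an encoded symbolic sequence, using BIC over fixed-order Markov chains as a simplified stand-in for the variable-length context trees and the smallest maximizer criterion of the paper; the encoded sequence and alphabet are synthetic.

```python
import math
from collections import Counter

# BIC-based order selection for ordinary (fixed-order) Markov chains over a
# small symbolic alphabet.  The "rhythm-encoded" sequence below is synthetic.

seq = "0120120102012012010201120" * 20   # assumed encoded text
alphabet = sorted(set(seq))

def bic_score(seq, order, alphabet):
    """Maximized log-likelihood of an order-k chain minus the BIC penalty."""
    counts, ctx_counts = Counter(), Counter()
    for i in range(order, len(seq)):
        ctx, sym = seq[i - order:i], seq[i]
        counts[(ctx, sym)] += 1
        ctx_counts[ctx] += 1
    loglik = sum(c * math.log(c / ctx_counts[ctx]) for (ctx, _), c in counts.items())
    n_params = len(ctx_counts) * (len(alphabet) - 1)
    return loglik - 0.5 * n_params * math.log(len(seq) - order)

for k in range(4):
    print(f"order {k}: BIC score = {bic_score(seq, k, alphabet):.1f}")
```

The selected order is the one with the highest score; the paper's criterion instead tracks how the selected context tree changes as the penalty constant varies.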
Abstract:
Decision tree induction algorithms represent one of the most popular techniques for dealing with classification problems. However, traditional decision-tree induction algorithms implement a greedy approach for node splitting that is inherently susceptible to convergence to local optima. Evolutionary algorithms can avoid the problems associated with a greedy search and have been successfully applied to the induction of decision trees. Previously, we proposed a lexicographic multi-objective genetic algorithm for decision-tree induction, named LEGAL-Tree. In this work, we propose extending this approach substantially, particularly with respect to two important evolutionary aspects: the initialization of the population and the fitness function. We carry out a comprehensive set of experiments to validate our extended algorithm. The experimental results suggest that it is able to outperform both traditional algorithms for decision-tree induction and another evolutionary algorithm in a variety of application domains.
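The sketch below illustrates a lexicographic fitness comparison of the kind used by lexicographic multi-objective genetic algorithms: objectives are ranked by priority, and a lower-priority objective only breaks ties that fall within a tolerance on the higher-priority one. The objective names, tolerances, and individuals are illustrative assumptions, not LEGAL-Tree's actual configuration.

```python
# Lexicographic comparison of two candidate decision trees (illustrative).

OBJECTIVES = [                      # (name, tolerance), highest priority first
    ("accuracy", 0.01),             # larger is better
    ("tree_size", 0.0),             # smaller is better (negated below)
]

def lexicographic_better(a, b):
    """Return True if individual `a` beats `b` under the lexicographic rule."""
    for name, tol in OBJECTIVES:
        va, vb = a[name], b[name]
        if name == "tree_size":     # convert to "larger is better"
            va, vb = -va, -vb
        if abs(va - vb) > tol:      # difference exceeds tolerance: decide here
            return va > vb
    return False                    # indistinguishable under all tolerances

ind_a = {"accuracy": 0.905, "tree_size": 17}
ind_b = {"accuracy": 0.900, "tree_size": 9}
# Accuracies differ by less than the tolerance, so the smaller tree wins:
print(lexicographic_better(ind_a, ind_b))   # False: ind_b is preferred
```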
Abstract:
Non-Equilibrium Statistical Mechanics is a broad subject. Roughly speaking, it deals with systems which have not yet relaxed to an equilibrium state, with systems which are in a steady non-equilibrium state, or with more general situations. They are characterized by external forcing and internal fluxes, resulting in a net production of entropy which quantifies dissipation and the extent to which, by the Second Law of Thermodynamics, time-reversal invariance is broken. In this thesis we discuss some of the mathematical structures involved with generic discrete-state-space non-equilibrium systems, which we depict with networks fully analogous to electrical networks. We define suitable observables and derive their linear-regime relationships, we discuss a duality between external and internal observables that reverses the roles of the system and of the environment, and we show that network observables serve as constraints for a derivation of the minimum entropy production principle. We dwell on deep combinatorial aspects regarding linear response determinants, which are related to spanning tree polynomials in graph theory, and we give a geometrical interpretation of observables in terms of Wilson loops of a connection and gauge degrees of freedom. We specialize the formalism to continuous-time Markov chains, we give a physical interpretation of observables in terms of locally detailed balanced rates, we prove many variants of the fluctuation theorem, and we show that a well-known expression for the entropy production due to Schnakenberg follows from considerations of gauge invariance, where the gauge symmetry is related to the freedom in the choice of a prior probability distribution. As an additional topic of geometrical flavor related to continuous-time Markov chains, we discuss the Fisher-Rao geometry of non-equilibrium decay modes, showing that the Fisher matrix contains information about many aspects of non-equilibrium behavior, including non-equilibrium phase transitions and superposition of modes. We establish a sort of statistical equivalence principle and discuss the behavior of the Fisher matrix under time reversal. To conclude, we propose that geometry and combinatorics might greatly increase our understanding of non-equilibrium phenomena.
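The short example below computes Schnakenberg's entropy production rate for a three-state continuous-time Markov chain at its steady state. The rates are arbitrary illustrative numbers chosen to break detailed balance; the thesis's network and gauge-theoretic constructions are not represented.

```python
import numpy as np

# Schnakenberg entropy production for a 3-state cycle with assumed rates.

W = np.array([[0.0, 2.0, 0.5],     # W[i, j] = transition rate i -> j
              [0.5, 0.0, 2.0],
              [2.0, 0.5, 0.0]])

# Generator of the master equation: off-diagonal rates, diagonal fixed by
# probability conservation
L = W - np.diag(W.sum(axis=1))

# Steady state: null vector of the transposed generator, normalised to sum to 1
eigvals, eigvecs = np.linalg.eig(L.T)
p = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
p /= p.sum()

# Entropy production rate: half the sum over ordered state pairs of
# (net flux) * log(forward traffic / backward traffic)
sigma = 0.0
for i in range(3):
    for j in range(3):
        if i != j and W[i, j] > 0:
            sigma += 0.5 * (p[i] * W[i, j] - p[j] * W[j, i]) * \
                     np.log((p[i] * W[i, j]) / (p[j] * W[j, i]))

print("steady state:", p, " entropy production rate:", sigma)
```

At detailed balance every net flux vanishes and the rate is zero; the asymmetric cycle above gives a strictly positive value, as the Second Law requires.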