991 results for Random variables


Relevance:

60.00%

Publisher:

Abstract:

The paper investigates which of Shannon’s measures (entropy, conditional entropy, mutual information) is the right one for the task of quantifying information flow in a programming language. We examine earlier relevant contributions from Denning, McLean and Gray and we propose and motivate a specific quantitative definition of information flow. We prove results relating equivalence relations, interference of program variables, independence of random variables and the flow of confidential information. Finally, we show how, in our setting, Shannon’s Perfect Secrecy theorem provides a sufficient condition to determine whether a program leaks confidential information.
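As a toy illustration of the quantity at stake (a minimal sketch in Python, not the paper's formalism), the leakage of a deterministic program can be computed as the mutual information between the confidential input H and the observable output O; the parity-leaking program and the uniform prior below are hypothetical.

```python
from collections import Counter
from math import log2

# Hypothetical toy program: it leaks only the parity of a 3-bit secret.
def program(h):
    return h % 2

secrets = range(8)          # uniform prior over the confidential input H
p_h = 1 / 8

# Joint distribution of (H, O) induced by running the program.
joint = Counter()
for h in secrets:
    joint[(h, program(h))] += p_h

# Marginal distribution of the observable output O.
p_o = Counter()
for (h, o), p in joint.items():
    p_o[o] += p

# Mutual information I(H;O) = sum_{h,o} p(h,o) * log2(p(h,o) / (p(h) p(o))).
leakage = sum(p * log2(p / (p_h * p_o[o])) for (h, o), p in joint.items())
print(f"information flow I(H;O) = {leakage:.3f} bits")   # 1 bit: parity leaks
```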

Relevance:

60.00%

Publisher:

Abstract:

Climate change has resulted in substantial variations in annual extreme rainfall quantiles across durations and return periods. Predicting future changes in extreme rainfall quantiles is essential for water resources design, assessment, and decision-making purposes. Current predictions of future rainfall extremes, however, exhibit large uncertainties. According to extreme value theory, rainfall extremes are random variables whose distributions vary across return periods, so there is uncertainty even under current climate conditions. Regarding future conditions, our large-scale knowledge is obtained from global climate models forced with particular emission scenarios. Climate models have widely known deficiencies, particularly with respect to precipitation projections, and emission scenarios are recognized as limited representations of future global change. Apart from these large-scale uncertainties, downscaling methods add further uncertainty to estimates of future extreme rainfall when they convert the larger-scale projections to the local scale. The aim of this research is to address these uncertainties in future projections of extreme rainfall of different durations and return periods. We combined three emission scenarios with two global climate models and used LARS-WG, a well-known weather generator, to stochastically downscale the climate models' daily projections for the city of Saskatoon, Canada, out to 2100. The downscaled projections were further disaggregated to hourly resolution using our new stochastic, non-parametric rainfall disaggregator. Extreme rainfall quantiles can then be estimated for different durations (1-hour, 2-hour, 4-hour, 6-hour, 12-hour, 18-hour and 24-hour) and return periods (2-year, 10-year, 25-year, 50-year, 100-year) using the Generalized Extreme Value (GEV) distribution. By providing multiple realizations of future rainfall, we attempt to measure the extent of the total predictive uncertainty contributed by climate models, emission scenarios, and the downscaling/disaggregation procedures. The results show that these contributions differ in proportion across durations and return periods.
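The quantile-estimation step can be sketched as follows (illustrative only, with synthetic data; the authors' actual pipeline runs on downscaled and disaggregated series): fit a GEV distribution to the annual maxima and read off the T-year return levels.

```python
from scipy.stats import genextreme

# Synthetic annual-maximum 1-hour rainfall depths (mm), standing in for an
# observed or downscaled series.
annual_max = genextreme.rvs(c=-0.1, loc=20, scale=5, size=60, random_state=42)

# Fit the GEV distribution to the annual maxima.
shape, loc, scale = genextreme.fit(annual_max)

# The T-year return level is the quantile with non-exceedance prob. 1 - 1/T.
for T in (2, 10, 25, 50, 100):
    q = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year 1-hour rainfall: {q:6.1f} mm")
```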

Relevance:

60.00%

Publisher:

Abstract:

We apply the concept of exchangeable random variables to the case of non-additive probability distributions exhibiting uncertainty aversion, in the class generated by a convex core (convex non-additive probabilities with a convex core). We are able to prove two versions of the law of large numbers (de Finetti's theorems). By making use of two definitions of independence we prove two versions of the strong law of large numbers. It turns out that we cannot assure the convergence of the sample averages to a constant. We then model the case where there is a "true" probability distribution behind the successive realizations of the uncertain random variable; in this case convergence occurs. This result is important because it renders true the intuition that it is possible to "learn" the "true" additive distribution behind an uncertain event if one observes it repeatedly (a sufficiently large number of times). We also provide a conjecture regarding the "learning" (or updating) process above, and prove a partial result for the case of the Dempster-Shafer updating rule and binomial trials.
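A crude numerical illustration of the non-convergence phenomenon (not from the paper; the block-alternation scheme is an invented adversary): when each draw's success probability may lie anywhere in an interval, probabilities alternating between the endpoints over exponentially growing blocks keep the running sample average oscillating, so it converges to no constant.

```python
import numpy as np

rng = np.random.default_rng(0)
p_lo, p_hi = 0.3, 0.7   # success probability known only to lie in this interval
n_blocks = 16

draws, p, block_len = [], p_lo, 1
for _ in range(n_blocks):
    # Adversarial choice within the interval: alternate the endpoints over
    # exponentially growing blocks so the running average keeps swinging.
    draws.append(rng.random(block_len) < p)
    p = p_hi if p == p_lo else p_lo
    block_len *= 2

x = np.concatenate(draws).astype(float)
means = np.cumsum(x) / np.arange(1, x.size + 1)
tail = means[x.size // 4:]
print(f"running average over the last blocks: min={tail.min():.3f}, "
      f"max={tail.max():.3f}")   # keeps oscillating; no single limit point
```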

Relevance:

60.00%

Publisher:

Abstract:

Bounds on the distribution function of the sum of two random variables with known marginal distributions, obtained by Makarov (1981), can be used to bound the cumulative distribution function (c.d.f.) of individual treatment effects. Identification of the distribution of individual treatment effects is important for policy purposes if we are interested in functionals of that distribution, such as the proportion of individuals who gain from the treatment and the expected gain for those individuals. Makarov bounds on the c.d.f. of the individual treatment effect distribution are pointwise sharp, i.e. they cannot be improved at any single point of the distribution. We show that the Makarov bounds are not uniformly sharp. Specifically, we show that the Makarov bounds on the region containing the values of the treatment-effect c.d.f. at two (or more) points can be improved, and we derive the smallest set for the c.d.f. of the treatment effect distribution at two (or more) points. An implication is that the Makarov bounds on a functional of the c.d.f. of the individual treatment effect distribution are not best possible.
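For intuition, the pointwise Makarov bounds on the c.d.f. of the treatment effect Delta = Y1 - Y0 can be approximated on a grid from the two marginals (a sketch with hypothetical normal marginals, not the authors' code): the lower bound is max(sup_y {F1(y) - F0(y - d)}, 0) and the upper bound is 1 + min(inf_y {F1(y) - F0(y - d)}, 0).

```python
import numpy as np
from scipy.stats import norm

# Hypothetical marginals: potential outcomes Y1 ~ N(1, 1) and Y0 ~ N(0, 1).
F1 = lambda y: norm.cdf(y, loc=1.0, scale=1.0)
F0 = lambda y: norm.cdf(y, loc=0.0, scale=1.0)

y_grid = np.linspace(-8, 8, 4001)       # grid approximating sup/inf over y

def makarov_bounds(delta):
    """Pointwise Makarov bounds on P(Y1 - Y0 <= delta) from the marginals."""
    diff = F1(y_grid) - F0(y_grid - delta)
    lower = max(diff.max(), 0.0)
    upper = 1.0 + min(diff.min(), 0.0)
    return lower, upper

for d in (-1.0, 0.0, 1.0, 2.0):
    lo, hi = makarov_bounds(d)
    print(f"delta={d:+.1f}:  {lo:.3f} <= F_Delta(delta) <= {hi:.3f}")
```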

Relevance:

60.00%

Publisher:

Abstract:

The Java Platform is increasingly being adopted in the development of distributed systems with high user demand. This kind of application is more complex because, beyond meeting the functional requirements, it must fulfill pre-established performance parameters. This work studies the Java Virtual Machine (JVM), covering its internal aspects and exploring the garbage collection strategies found in the literature and used by the JVM. It also presents a set of tools that help in optimizing applications, and others that help in monitoring applications in the production environment. Given the great number of technologies that aim to solve problems common to the application layer, it becomes difficult to choose the one with the best response time and the lowest memory usage. This work presents a brief introduction to each of the candidate technologies and carries out comparative tests through a statistical analysis of the response-time and garbage-collection-activity random variables. The results supply engineers and managers with a basis for deciding which technologies to use in large applications, through knowledge of how they behave in those environments and the amount of resources they consume. The relation between a technology's productivity and its performance is also considered an important factor in this choice.
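The kind of statistical comparison described might look like the following sketch (the technologies and samples are invented): summarize response-time samples from two candidates and test whether their means differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical response-time samples (ms) from two candidate technologies.
tech_a = rng.lognormal(mean=3.0, sigma=0.3, size=500)
tech_b = rng.lognormal(mean=3.1, sigma=0.3, size=500)

for name, sample in (("A", tech_a), ("B", tech_b)):
    print(f"tech {name}: mean={sample.mean():.1f} ms, "
          f"p95={np.percentile(sample, 95):.1f} ms")

# Welch's t-test: do the mean response times differ significantly?
t, p = stats.ttest_ind(tech_a, tech_b, equal_var=False)
print(f"Welch t-test: t={t:.2f}, p={p:.4f}")
```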

Relevance:

60.00%

Publisher:

Abstract:

Following the interdisciplinary tendency of modern science, a new field called neuroengineering has emerged in recent decades; since 2000, scientific journals and conferences on this theme have been created around the world. The present work comprises three subareas related to neuroengineering, electrical engineering, and biomedical engineering: neural stimulation, theoretical and computational neuroscience, and neuronal signal processing. The research can be divided into three parts. (i) A new method of neuronal photostimulation was developed based on the use of caged compounds. Using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marcenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. Applying the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can be part of the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method of automatic classification of heart beats was developed which does not rely on a database for training and is not specialized to specific pathologies; it is based on wavelet decomposition and normality measures of random variables. Together, the results presented in these three fields of knowledge represent qualification in neural and biomedical engineering.
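A minimal sketch of the eigenvalue criterion in part (ii) (simulated data; the binning and z-scoring conventions are assumptions): for N z-scored neurons over B time bins, independent activity yields correlation-matrix eigenvalues below the Marcenko-Pastur bound (1 + sqrt(N/B))^2, so larger eigenvalues flag putative assemblies.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_bins = 50, 5000

# Simulated z-scored spike counts: independent background activity...
counts = rng.normal(size=(n_bins, n_neurons))
# ...plus one assembly: neurons 0-9 share a common activation signal.
assembly_signal = rng.normal(size=n_bins)
counts[:, :10] += 0.3 * assembly_signal[:, None]

# Z-score each neuron and form the correlation matrix.
z = (counts - counts.mean(0)) / counts.std(0)
corr = (z.T @ z) / n_bins

# Marcenko-Pastur upper bound for eigenvalues of the correlation matrix of
# independent neurons: (1 + sqrt(N/B))^2.
lam_max = (1 + np.sqrt(n_neurons / n_bins)) ** 2
eigvals = np.linalg.eigvalsh(corr)

n_assemblies = int((eigvals > lam_max).sum())
print(f"MP bound = {lam_max:.3f}; significant eigenvalues (assemblies): {n_assemblies}")
```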

Relevance:

60.00%

Publisher:

Abstract:

In this work, the paper by Campos and Dorea [3] is presented in detail. In that article a kernel estimator was applied to a sequence of independent and identically distributed random variables with a general state space. In chapter 2, the estimator's properties, such as asymptotic unbiasedness, consistency in quadratic mean, strong consistency, and asymptotic normality, are verified. In chapter 3, numerical experiments developed with the R software give a visual idea of the estimation process.
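To give a flavor of the numerical experiments in chapter 3 (the original used R; this Gaussian-kernel sketch in Python, with Silverman's rule-of-thumb bandwidth, is only illustrative), a kernel density estimate from an i.i.d. sample can be compared against the true density.

```python
import numpy as np

rng = np.random.default_rng(3)
sample = rng.normal(loc=0.0, scale=1.0, size=1000)   # i.i.d. N(0,1) draws

h = 1.06 * sample.std() * len(sample) ** (-1 / 5)    # Silverman's bandwidth

def kde(x, data, h):
    """Gaussian-kernel density estimate at the points x."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

grid = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
true = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)
for x, est, tr in zip(grid, kde(grid, sample, h), true):
    print(f"x={x:+.1f}: KDE={est:.3f}, true={tr:.3f}")
```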

Relevance:

60.00%

Publisher:

Abstract:

The aim of this work is to test an algorithm that estimates, in real time, the attitude of an artificial satellite, using real data supplied by the attitude sensors on board the CBERS-2 (China-Brazil Earth Resources Satellite). The real-time estimator used for attitude determination is the Unscented Kalman Filter, a new alternative to the extended Kalman filter usually applied to attitude and orbit estimation and control problems. The algorithm can estimate the states of nonlinear systems without linearizing the nonlinear functions present in the model. This is possible thanks to a transformation that generates a set of vectors which, after undergoing the nonlinear transformation, preserve the mean and covariance of the random variables before the transformation. Performance is evaluated and analyzed by comparing the results of the Unscented Kalman filter and the extended Kalman filter on real onboard data.
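The transformation referred to is the unscented transform; a bare-bones generic sketch (not the satellite-attitude implementation, and the polar-to-Cartesian nonlinearity is invented) shows how 2n+1 sigma points carry the mean and covariance through a nonlinearity.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear f via 2n+1 sigma points."""
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)        # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])
    w = np.full(2 * n + 1, 1 / (2 * (n + kappa)))    # sigma-point weights
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])
    y_mean = w @ y
    y_cov = (w[:, None] * (y - y_mean)).T @ (y - y_mean)
    return y_mean, y_cov

# Example nonlinearity: polar -> Cartesian conversion.
f = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
mean = np.array([1.0, np.pi / 4])
cov = np.diag([0.01, 0.05])
m, P = unscented_transform(mean, cov, f, kappa=1.0)
print("transformed mean:", m, "\ntransformed cov:\n", P)
```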

Relevance:

60.00%

Publisher:

Abstract:

In this paper an efficient algorithm for the probabilistic analysis of unbalanced three-phase weakly meshed distribution systems is presented. The algorithm uses the Two-Point Estimate Method to calculate the probabilistic behavior of the system's random variables, while the deterministic analysis of the state variables is performed by a Compensation-Based Radial Load Flow (CBRLF), a load flow that efficiently exploits the topological characteristics of the network. To deal with distributed generation, a strategy for incorporating a simplified generator model into the CBRLF is proposed: depending on the type of control and the generator's operating conditions, a node with distributed generation can be modeled either as a PV or a PQ node. To validate the efficiency of the proposed algorithm, the IEEE 37-bus test system is used, and the probabilistic results are compared with those obtained using the Monte Carlo method.
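A stripped-down version of the Two-Point Estimate Method for independent, zero-skewness inputs (a generic sketch; the paper applies the method to power-flow solves): each of the n inputs is evaluated at mu_k ± sqrt(n)·sigma_k while the others are held at their means, giving 2n model evaluations from which the output's mean and standard deviation follow.

```python
import numpy as np

def two_point_estimate(f, mu, sigma):
    """Hong's 2n-point TPM for independent inputs with zero skewness."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    n = mu.size
    w = 1.0 / (2 * n)                       # equal concentration weights
    m1 = m2 = 0.0
    for k in range(n):
        for s in (+1.0, -1.0):
            x = mu.copy()
            x[k] += s * np.sqrt(n) * sigma[k]   # shift one input at a time
            y = f(x)
            m1 += w * y                     # first raw moment  E[Y]
            m2 += w * y**2                  # second raw moment E[Y^2]
    return m1, np.sqrt(max(m2 - m1**2, 0.0))

# Toy "system model" standing in for a power-flow solve.
f = lambda x: x[0] ** 2 + 2 * x[1] + 0.5 * x[0] * x[1]
mean, std = two_point_estimate(f, mu=[1.0, 2.0], sigma=[0.1, 0.3])

# Validate against Monte Carlo, as in the paper.
rng = np.random.default_rng(0)
samples = rng.normal([1.0, 2.0], [0.1, 0.3], size=(200_000, 2))
y = samples[:, 0] ** 2 + 2 * samples[:, 1] + 0.5 * samples[:, 0] * samples[:, 1]
print(f"TPM: mean={mean:.4f}, std={std:.4f}")
print(f"MC : mean={y.mean():.4f}, std={y.std():.4f}")
```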

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

The study of the association between two random variables that have a joint normal distribution is of interest in applied statistics, for example in statistical genetics. This article, targeted at applied statisticians, addresses inference about the coefficient of correlation (ρ) in the bivariate normal and standard bivariate normal distributions from likelihood, frequentist, and Bayesian perspectives. Some results are surprising. For instance, the maximum likelihood estimator and the posterior distribution of ρ in the standard bivariate normal distribution do not follow directly from the results for a general bivariate normal distribution. An example employing bootstrap and rejection sampling procedures illustrates some of these peculiarities.
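The bootstrap side of such an analysis can be sketched as follows (synthetic data, not the article's example): resample the rows of a bivariate normal sample and recompute the sample correlation to get an interval estimate for ρ.

```python
import numpy as np

rng = np.random.default_rng(5)
rho, n = 0.6, 100

# Sample from a standard bivariate normal with correlation rho.
cov = [[1.0, rho], [rho, 1.0]]
data = rng.multivariate_normal([0.0, 0.0], cov, size=n)

r_hat = np.corrcoef(data[:, 0], data[:, 1])[0, 1]

# Nonparametric bootstrap: resample rows, recompute the correlation.
boot = np.empty(5000)
for b in range(boot.size):
    idx = rng.integers(0, n, size=n)
    boot[b] = np.corrcoef(data[idx, 0], data[idx, 1])[0, 1]

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"r = {r_hat:.3f}, 95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
```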

Relevance:

60.00%

Publisher:

Abstract:

In this paper a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way in the context of multi-area power systems. The decomposition framework solves the problem by iteratively solving smaller subproblems, each associated with one area of the power system. This strategy allows the probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas; only border information related to the tie-lines between areas needs to be exchanged. An efficient method for probabilistic analysis, considering uncertainty in the n system loads, is applied: a particular case of the point estimate method known as the Two-Point Estimate Method (TPM), rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it only requires solving 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented; it solves the multi-area PPF problem in a decentralized way, ensures the independent operation of each area, and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
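The coordination idea can be caricatured on a two-area toy problem (a Gauss-Seidel-style sketch, not the paper's algorithm): each area minimizes its own objective with the shared tie-line value held fixed, and only that border value is exchanged until the first-order conditions of the joint problem are satisfied.

```python
# Minimize (x-2)^2 + (x-t)^2 + (y+1)^2 + (y-t)^2, where t is a shared
# tie-line variable: area A knows only its own terms in x, area B only
# its own terms in y; they exchange only the border value t.
def solve_area_a(t):
    return (2 + t) / 2          # argmin_x (x - 2)^2 + (x - t)^2

def solve_area_b(t):
    return (t - 1) / 2          # argmin_y (y + 1)^2 + (y - t)^2

t = 0.0
for it in range(100):
    x = solve_area_a(t)         # each area solves its own subproblem
    y = solve_area_b(t)
    t_new = (x + y) / 2         # coordinator reconciles the border value
    if abs(t_new - t) < 1e-10:  # agreement = joint first-order conditions
        break
    t = t_new
print(f"converged in {it} iterations: x={x:.4f}, y={y:.4f}, tie-line t={t:.4f}")
```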

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Mechanical Engineering - FEG

Relevance:

60.00%

Publisher:

Abstract:

In this paper, the optimal reactive power planning problem under risk is presented. The classical mixed-integer nonlinear model for reactive power planning is expanded into a two-stage stochastic model that considers risk, with uncertainty in the demand load. Risk is quantified by a factor introduced into the objective function and is identified with the variance of the random variables. Finally, numerical results illustrate the performance of the proposed model, which is applied to the IEEE 30-bus test system to determine the optimal amount and location of reactive power expansion.
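The risk term can be sketched as follows (schematic only; the actual model is a two-stage stochastic mixed-integer program): the objective trades the expected cost off against its variance over demand scenarios, weighted by a risk factor.

```python
import numpy as np

# Hypothetical second-stage costs of one expansion plan under demand scenarios.
scenario_cost = np.array([120.0, 135.0, 150.0, 180.0])
scenario_prob = np.array([0.4, 0.3, 0.2, 0.1])

expected = scenario_prob @ scenario_cost
variance = scenario_prob @ (scenario_cost - expected) ** 2

beta = 0.05   # risk-aversion factor weighting the variance term
objective = expected + beta * variance
print(f"E[cost]={expected:.1f}, Var[cost]={variance:.1f}, "
      f"risk-adjusted objective={objective:.2f}")
```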