964 results for Distribution (Probability theory)


Relevance:

30.00%

Abstract:

Modeling physiological processes using tracer kinetic methods requires knowledge of the time course of the tracer concentration in blood supplying the organ. For liver studies, however, inaccessibility of the portal vein makes direct measurement of the hepatic dual-input function impossible in humans. We want to develop a method to predict the portal venous time-activity curve from measurements of an arterial time-activity curve. An impulse-response function based on a continuous distribution of washout constants is developed and validated for the gut. Experiments with simultaneous blood sampling in aorta and portal vein were made in 13 anesthetized pigs following inhalation of intravascular [O-15]CO or injections of diffusible 3-O-[C-11]methylglucose (MG). The parameters of the impulse-response function have a physiological interpretation in terms of the distribution of washout constants and are mathematically equivalent to the mean transit time (T̄) and standard deviation of transit times. The results include estimates of mean transit times from the aorta to the portal vein in pigs: T̄ = 0.35 ± 0.05 min for CO and 1.7 ± 0.1 min for MG. The prediction of the portal venous time-activity curve benefits from constraining the regression fits by parameters estimated independently. This is strong evidence for the physiological relevance of the impulse-response function, which includes asymptotically, and thereby justifies kinetically, a useful and simple power law. Similarity between our parameter estimates in pigs and parameter estimates in normal humans suggests that the proposed model can be adapted for use in humans.
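
As a rough illustration of the idea (not the authors' exact model), the sketch below convolves an arterial time-activity curve with an impulse-response function built from a continuous, here gamma-shaped, distribution of washout rate constants; such a mixture has the power-law tail mentioned above, and its mean and standard deviation of transit times are read off directly. All parameter values and the gamma choice are illustrative assumptions.

```python
# Minimal sketch: predict a portal-venous time-activity curve by convolving
# an arterial input with an impulse response built from a continuous (gamma)
# distribution of washout rate constants.  All values are illustrative.
import numpy as np

def impulse_response(t, a=4.0, b=1.0):
    """Mixture of exponential washouts with Gamma(a, b)-distributed rate
    constants: h(t) = a * b**a / (t + b)**(a + 1); note the power-law tail."""
    return a * b**a / (t + b) ** (a + 1)

dt = 0.01                                           # minutes
t = np.arange(0.0, 10.0, dt)
arterial = t * np.exp(-2.0 * t)                     # illustrative arterial curve
h = impulse_response(t)
portal = np.convolve(arterial, h)[: t.size] * dt    # predicted portal curve

# Mean transit time and standard deviation of transit times implied by h.
mtt = np.sum(t * h) / np.sum(h)
sd = np.sqrt(np.sum((t - mtt) ** 2 * h) / np.sum(h))
print(f"mean transit time ~ {mtt:.2f} min, SD of transit times ~ {sd:.2f} min")
```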

Relevance:

30.00%

Abstract:

Sensitivity of the output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite-power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite-power or directionally generic inputs whose anisotropy is bounded above by a ≥ 0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on a multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation-invariant operators over such fields.
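
For reference, the finite-power anisotropy functional and the associated a-anisotropic norm are commonly written as below (a sketch of the standard definitions used in anisotropy-based control; the notation, in particular h(w) for differential entropy and the power norm, is assumed here rather than taken from this abstract).

```latex
% Anisotropy of an R^m-valued random vector w with density f and finite
% second moment: the minimal KL divergence from the Gaussian densities
% p_{m,\lambda} with zero mean and scalar covariance \lambda I_m.
\[
  \mathbf{A}(w) \;=\; \min_{\lambda>0} D\!\left(f \,\big\|\, p_{m,\lambda}\right)
  \;=\; \frac{m}{2}\,\ln\!\left(\frac{2\pi e}{m}\,\mathbf{E}\,|w|^{2}\right) - h(w),
\]
% where h(w) is the differential entropy of w.  The a-anisotropic norm of a
% matrix F is the worst-case root-mean-square gain over inputs whose
% anisotropy does not exceed a:
\[
  \|F\|_{a} \;=\; \sup\left\{\frac{\|Fw\|_{\mathbf{P}}}{\|w\|_{\mathbf{P}}}
  \;:\; \mathbf{A}(w)\le a\right\},
  \qquad \|w\|_{\mathbf{P}} \;=\; \sqrt{\mathbf{E}\,|w|^{2}}.
\]
```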

Relevance:

30.00%

Abstract:

A new lifetime distribution capable of modeling a bathtub-shaped hazard-rate function is proposed. The proposed model is derived as a limiting case of the Beta Integrated Model and has both the Weibull distribution and Type I extreme value distribution as special cases. The model can be considered as another useful 3-parameter generalization of the Weibull distribution. An advantage of the model is that the model parameters can be estimated easily based on a Weibull probability paper (WPP) plot that serves as a tool for model identification. Model characterization based on the WPP plot is studied. A numerical example is provided and comparison with another Weibull extension, the exponentiated Weibull, is also discussed. The proposed model compares well with other competing models to fit data that exhibits a bathtub-shaped hazard-rate function.
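
Since model identification in this approach hinges on the Weibull probability paper (WPP) plot, a minimal sketch of how such a plot is produced from failure data may help; the data, the median-rank plotting positions, and the matplotlib calls are illustrative choices, not taken from the paper.

```python
# Minimal WPP (Weibull probability paper) sketch: plot ln(-ln(1 - F)) against
# ln(t) using median-rank estimates of the empirical CDF.  A 2-parameter
# Weibull sample falls on a straight line; curvature signals an extension
# such as a bathtub-shaped model.  The data below are illustrative only.
import numpy as np
import matplotlib.pyplot as plt

failures = np.sort(np.array([0.1, 0.3, 0.9, 2.0, 3.5, 5.1, 7.2, 9.8, 12.4, 15.0]))
n = failures.size
i = np.arange(1, n + 1)
F = (i - 0.3) / (n + 0.4)          # median-rank approximation of the CDF

x = np.log(failures)
y = np.log(-np.log(1.0 - F))

plt.plot(x, y, "o")
plt.xlabel("ln(t)")
plt.ylabel("ln(-ln(1 - F(t)))")
plt.title("Weibull probability paper plot (illustrative data)")
plt.show()
```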

Relevance:

30.00%

Abstract:

We generalize the Flory-Stockmayer theory of percolation to a model of associating (patchy) colloids, which consists of hard spherical particles having on their surfaces f short-ranged attractive sites of m different types. These sites can form bonds between particles and thus promote self-assembly. It is shown that the percolation threshold is given in terms of the eigenvalues of an m x m matrix, which describes the recursive relations for the number of bonded particles on the i-th level of a cluster with no loops; percolation occurs when the largest of these eigenvalues equals unity. Expressions for the probability that a particle is not bonded to the giant cluster, for the average cluster size, and for the average size of a cluster to which a randomly chosen particle belongs are also derived. Explicit results for these quantities are computed for the case f = 3 and m = 2. We show how these structural properties are related to the thermodynamics of the associating system by regarding bond formation as an (equilibrium) chemical reaction. This solution of the percolation problem, combined with Wertheim's thermodynamic first-order perturbation theory, allows the investigation of the interplay between phase behavior and cluster formation for general models of patchy colloids.
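
A minimal numerical sketch of the eigenvalue criterion for f = 3 sites and m = 2 site types is given below. The specific form of the branching matrix (sites remaining after entry, times pairwise bonding probabilities) and all numbers are illustrative assumptions, not the paper's exact expressions; only the criterion "largest eigenvalue equals unity" is taken from the abstract.

```python
# Illustrative eigenvalue percolation criterion for patchy particles with
# f = 3 sites of m = 2 types (two of type A, one of type B per particle).
import numpy as np

N_SITES = np.array([2.0, 1.0])                 # sites of each type per particle
P_BOND = np.array([[1.0, 0.5],                 # assumed relative bonding
                   [0.5, 0.1]])                # probabilities p[i, j]

def branching_matrix(strength):
    """T[i, j]: mean number of new bonds through sites of type j on a particle
    entered through one of its sites of type i (illustrative form)."""
    T = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            remaining = N_SITES[j] - (1.0 if i == j else 0.0)   # entry site used
            T[i, j] = remaining * strength * P_BOND[i, j]
    return T

def largest_eigenvalue(strength):
    return np.max(np.abs(np.linalg.eigvals(branching_matrix(strength))))

# Percolation threshold: bisection on the overall bonding strength until the
# largest eigenvalue of the branching matrix equals 1.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if largest_eigenvalue(mid) < 1.0 else (lo, mid)
print(f"percolation threshold at bonding strength ~ {0.5 * (lo + hi):.4f}")
```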

Relevance:

30.00%

Abstract:

Although stock prices fluctuate, the variations are relatively small and are frequently assumed to be normally distributed on a large time scale. But sometimes these fluctuations can become substantial, especially when unforeseen large drops in asset prices are observed that could result in huge losses or even in market crashes. The evidence shows that these events happen far more often than would be expected under the generalized assumption of normally distributed financial returns. Thus it is crucial to properly model the distribution tails so as to be able to predict the frequency and magnitude of extreme stock price returns. In this paper we follow the approach suggested by McNeil and Frey (2000) and combine GARCH-type models with Extreme Value Theory (EVT) to estimate the tails of three financial index returns (DJI, FTSE 100 and NIKKEI 225) representing three important financial areas in the world. Our results indicate that EVT-based conditional quantile estimates are much more accurate than those from conventional AR-GARCH models assuming normal or Student's t-distribution innovations when doing out-of-sample estimation (within the in-sample estimation, this is so for the right tail of the distribution of returns).
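
A minimal sketch of the two-step McNeil-Frey procedure, using the `arch` and `scipy` packages on synthetic returns; the AR(1)-GARCH(1,1) filter, the 95% threshold and the 99% quantile are illustrative choices, not the paper's exact settings.

```python
# Step 1: filter returns with an AR-GARCH model; step 2: fit a generalized
# Pareto distribution to the tail of the standardized residuals and combine
# the tail quantile with the volatility forecast (conditional VaR).
import numpy as np
from arch import arch_model
from scipy.stats import genpareto

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=2000)         # synthetic daily % returns

# Step 1: AR(1)-GARCH(1,1) filter and standardized residuals.
res = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1).fit(disp="off")
z = np.asarray(res.resid / res.conditional_volatility)
z = z[~np.isnan(z)]

# Step 2: GPD fit to residual losses above a high threshold, then the
# tail quantile z_q at level q.
losses = -z
u = np.quantile(losses, 0.95)                     # threshold: 95th percentile
exc = losses[losses > u] - u
xi, _, beta = genpareto.fit(exc, floc=0)          # shape xi, scale beta
q, n, n_u = 0.99, losses.size, exc.size
z_q = u + beta / xi * (((1 - q) * n / n_u) ** (-xi) - 1.0)

# One-step-ahead conditional VaR from the GARCH forecast and z_q.
f = res.forecast(horizon=1)
mu, sigma = f.mean.iloc[-1, 0], float(np.sqrt(f.variance.iloc[-1, 0]))
print(f"one-day 99% conditional VaR (as a loss): {sigma * z_q - mu:.3f}%")
```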

Relevance:

30.00%

Abstract:

Dissertation presented to the Instituto Politécnico do Porto for the degree of Master in Logistics. Supervised by: Prof. Dr. Pedro Godinho

Relevance:

30.00%

Abstract:

This paper presents a methodology that aims to increase the probability of delivering power to any load point of the electrical distribution system by identifying new investments in distribution components. The methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modelling for system component outage parameters. Fuzzy membership functions of system component outage parameters are obtained from statistical records. A mixed integer non-linear optimization technique is developed to identify adequate investments in distribution network components that increase the availability level for any customer in the distribution system at minimum cost for the system operator. To illustrate the application of the proposed methodology, the paper includes a case study that considers a real distribution network.
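
As a rough illustration (not the paper's exact construction), the sketch below builds a triangular fuzzy membership function for a component failure rate from statistical records; the triangular shape and the min/median/max support are assumptions.

```python
# Minimal sketch: triangular fuzzy membership function for a component
# failure rate built from statistical records (illustrative values).
import numpy as np

records = np.array([0.08, 0.10, 0.11, 0.12, 0.12, 0.13, 0.15, 0.19])  # failures/yr

a, c = records.min(), records.max()
b = np.median(records)                     # taken as the most plausible value

def membership(x, a=a, b=b, c=c):
    """Triangular membership mu(x) on [a, c], peaking at b."""
    x = np.asarray(x, dtype=float)
    left = np.clip((x - a) / (b - a), 0.0, 1.0)
    right = np.clip((c - x) / (c - b), 0.0, 1.0)
    return np.minimum(left, right)

print(membership([0.09, 0.12, 0.18]))      # degrees of membership
```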

Relevance:

30.00%

Abstract:

The paper proposes a methodology to increase the probability of delivering power to any load point by identifying new investments in distribution energy systems. The proposed methodology is based on statistical failure and repair data of distribution components and uses fuzzy-probabilistic modeling of the component outage parameters. The fuzzy membership functions of the outage parameters of each component are based on statistical records. A mixed integer nonlinear programming optimization model is developed in order to identify the adequate investments in distribution energy system components that increase the probability of delivering power to any customer in the distribution system at the minimum possible cost for the system operator. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 180-bus distribution network.

Relevance:

30.00%

Abstract:

This paper presents a Game Theory-based methodology to allocate transmission costs, considering cooperation and competition between producers. As an original contribution, it finds each producer's degree of participation in the additional costs according to the demand behavior. A comparative study was carried out between the results obtained using the Nucleolus and the Shapley Value and those of other techniques, such as the Averages Allocation method and the Generalized Generation Distribution Factors (GGDF) method. As an example, a six-node network was used for the simulations. The results demonstrate the ability of the approach to find adequate solutions in an open-access network environment.
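
For concreteness, a minimal Shapley-value sketch for allocating a transmission cost among three producers is shown below; the characteristic-function values are illustrative and unrelated to the paper's six-node case.

```python
# Shapley value via averaging marginal costs over all orderings of players.
from itertools import permutations

players = ("G1", "G2", "G3")
v = {frozenset(): 0.0,                          # cost of serving each coalition
     frozenset({"G1"}): 60.0, frozenset({"G2"}): 50.0, frozenset({"G3"}): 40.0,
     frozenset({"G1", "G2"}): 90.0, frozenset({"G1", "G3"}): 80.0,
     frozenset({"G2", "G3"}): 70.0,
     frozenset({"G1", "G2", "G3"}): 105.0}

shapley = dict.fromkeys(players, 0.0)
orders = list(permutations(players))
for order in orders:                            # marginal cost in each ordering
    coalition = frozenset()
    for p in order:
        shapley[p] += v[coalition | {p}] - v[coalition]
        coalition = coalition | {p}
shapley = {p: cost / len(orders) for p, cost in shapley.items()}
print(shapley)                                  # allocations sum to v(grand coalition)
```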

Relevance:

30.00%

Abstract:

We present a new dynamical approach to Blumberg's equation, a family of unimodal maps. These maps are proportional to Beta(p, q) probability density functions. Using the symmetry of the Beta(p, q) distribution and symbolic dynamics techniques, a new concept of mirror symmetry is defined for this family of maps. The kneading theory is used to analyze the effect of such symmetry in the presented models. The main result proves that two mirror-symmetric unimodal maps have the same topological entropy. Different population dynamics regimes are identified when the intrinsic growth rate is modified: extinctions, stabilities, bifurcations, chaos and the Allee effect. To illustrate our results, we present a numerical analysis demonstrating the monotonicity of the topological entropy with the variation of the intrinsic growth rate, the existence of isentropic sets in the parameter space, and the mirror symmetry.
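
As a small illustration of the symbolic-dynamics machinery involved (not the paper's mirror-symmetry proof), the sketch below iterates a map proportional to a Beta(p, q) density, locates its critical point, and records the kneading itinerary of the critical orbit; the parameter values are illustrative.

```python
# Unimodal map proportional to a Beta(p, q) density:
#   f(x) = r * x**(p-1) * (1-x)**(q-1),
# with critical point at the Beta(p, q) mode.  The kneading itinerary records
# whether each iterate of the critical orbit falls left (L) or right (R) of
# the critical point.
import numpy as np

def beta_map(x, r, p, q):
    return r * x ** (p - 1) * (1 - x) ** (q - 1)

def kneading_itinerary(r, p, q, n=20):
    c = (p - 1) / (p + q - 2)              # critical (turning) point
    itinerary, x = [], beta_map(c, r, p, q)
    for _ in range(n):
        itinerary.append("L" if x < c else "R" if x > c else "C")
        x = beta_map(x, r, p, q)
    return "".join(itinerary)

p, q, r = 2.0, 3.0, 5.0                    # illustrative growth rate r
print("critical point:", (p - 1) / (p + q - 2))
print("kneading itinerary:", kneading_itinerary(r, p, q))
```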

Relevance:

30.00%

Abstract:

The theory of fractional calculus (FC) is a useful mathematical tool in many applied sciences. Nevertheless, only in recent decades have researchers been motivated to adopt FC concepts. There are several reasons for this state of affairs, namely the coexistence of different definitions and interpretations, and the need for approximation methods for the real-time calculation of fractional derivatives (FDs). In the first part, this paper introduces a probabilistic interpretation of the fractional derivative based on the Grünwald-Letnikov definition. In the second part, the calculation of fractional derivatives through Padé fraction approximations is analyzed. It is observed that the probabilistic interpretation and the frequency response of the fraction approximations of FDs reveal a clear correlation between the two concepts.
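
A minimal sketch of the probabilistic reading of the Grünwald-Letnikov definition for 0 < alpha < 1: the weights attached to past samples are positive and sum to one, so the fractional derivative can be viewed as h^(-alpha) times the difference between the present value and the expected value of the past. The truncation length and the test function are illustrative choices.

```python
# Grünwald-Letnikov weights w_k = (-1)**k * C(alpha, k) via the standard
# recursion; for 0 < alpha < 1 the past weights gamma_k = -w_k (k >= 1) are
# positive and sum to 1, which is the probabilistic interpretation.
import numpy as np

def gl_weights(alpha, K):
    w = np.empty(K + 1)
    w[0] = 1.0
    for k in range(1, K + 1):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

alpha = 0.5
gamma = -gl_weights(alpha, 5000)[1:]            # weights of the past samples
print("all past weights positive:", bool(np.all(gamma > 0)))
print("past weights sum to ~1:", gamma.sum())   # -> 1 as the horizon grows

def gl_derivative(f, t, alpha, h):
    """Truncated GL derivative with lower terminal 0:
    h**(-alpha) * sum_{k=0..t/h} w_k * f(t - k*h)."""
    K = int(t / h)
    w = gl_weights(alpha, K)
    k = np.arange(K + 1)
    return np.sum(w * f(t - k * h)) / h ** alpha

# Check against the known half-derivative of f(t) = t, namely 2*sqrt(t/pi).
t, h = 1.0, 0.001
print(gl_derivative(lambda x: x, t, alpha, h), "vs", 2 * np.sqrt(t / np.pi))
```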

Relevance:

30.00%

Abstract:

In this paper, we consider a Stackelberg duopoly competition with differentiated goods and with unknown costs. The firms' aim is to choose the output levels of their products according to the well-known concept of perfect Bayesian equilibrium. There is a firm (F1) that first chooses the quantity q1 of its good; the other firm (F2) observes q1 and then chooses the quantity q2 of its good. We suppose that each firm has two different technologies, and uses one of them following a probability distribution. The use of either one or the other technology affects the unitary production cost. We show that there is exactly one perfect Bayesian equilibrium for this game. We analyse the advantages, for firms and for consumers, of using the technology with the highest production cost versus the one with the lowest cost.
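
As a rough numerical illustration (the linear demand specification and all numbers are assumptions, not the paper's model), the sketch below computes a perfect Bayesian equilibrium of a Stackelberg duopoly with differentiated goods and two privately known cost types per firm.

```python
# Perfect Bayesian equilibrium sketch under assumed linear inverse demand
# p_i = a - q_i - g*q_j: the follower best-responds to the observed q1 and
# its realized cost, the leader maximizes expected profit over the
# follower's cost types.
import numpy as np

a, g = 10.0, 0.5                                    # demand intercept, differentiation
c1_types, prob1 = np.array([1.0, 2.0]), np.array([0.5, 0.5])   # leader's cost types
c2_types, prob2 = np.array([1.0, 3.0]), np.array([0.6, 0.4])   # follower's cost types
Ec2 = prob2 @ c2_types

def q1_star(c1):
    """Leader's PBE output, from the first-order condition of expected profit."""
    return (a - c1 - g * (a - Ec2) / 2) / (2 * (1 - g ** 2 / 2))

def q2_star(c2, q1):
    """Follower's best reply after observing q1 and its own realized cost."""
    return (a - g * q1 - c2) / 2

for c1 in c1_types:
    q1 = q1_star(c1)
    for c2 in c2_types:
        print(f"c1={c1}, c2={c2}: q1={q1:.3f}, q2={q2_star(c2, q1):.3f}")
```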

Relevance:

30.00%

Abstract:

The conclusions of the Bertrand model of competition are substantially altered by the presence of either differentiated goods or asymmetric information about the rival's production costs. In this paper, we consider a Bertrand competition with differentiated goods. Furthermore, we suppose that each firm has two different technologies, and uses one of them according to a certain probability distribution. The use of either one or the other technology affects the unitary production cost. We show that this game has exactly one Bayesian Nash equilibrium. We carry out ex-ante and ex-post analyses of firms' profits and market prices. We prove that the expected profit of each firm increases with the variance of its production costs. We also show that the expected price of each good increases with both expected production costs, with the effect of the rival's expected production costs dominated by the effect of the firm's own expected production costs.
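
The comparative-statics claim about cost variance can be checked numerically in a simple specification; the linear demand form and the two-type cost distributions below are illustrative assumptions, not the paper's model.

```python
# Bayesian Bertrand sketch with assumed linear demand q_i = a - b*p_i + d*p_j
# and independent private costs.  With the same mean cost, a higher own-cost
# variance yields a higher expected equilibrium profit.
import numpy as np

a, b, d = 10.0, 1.0, 0.5

def expected_profit(c1_types, prob1, c2_types, prob2):
    Ec1, Ec2 = prob1 @ c1_types, prob2 @ c2_types
    # Expected BNE prices solve E[p_i] = (a + d*E[p_j] + b*E[c_i]) / (2b).
    A = np.array([[2 * b, -d], [-d, 2 * b]])
    Ep = np.linalg.solve(A, np.array([a + b * Ec1, a + b * Ec2]))
    p1 = (a + d * Ep[1] + b * c1_types) / (2 * b)   # firm 1's price per cost type
    # Given c1, the first-order condition implies expected demand b*(p1 - c1),
    # so expected profit is b * E[(p1 - c1)**2].
    return b * prob1 @ (p1 - c1_types) ** 2

low_var = expected_profit(np.array([1.9, 2.1]), np.array([0.5, 0.5]),
                          np.array([2.0, 2.0]), np.array([0.5, 0.5]))
high_var = expected_profit(np.array([1.0, 3.0]), np.array([0.5, 0.5]),
                           np.array([2.0, 2.0]), np.array([0.5, 0.5]))
print(low_var, "<", high_var)   # same mean cost, higher variance -> higher profit
```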

Relevance:

30.00%

Abstract:

Smart Grids (SGs) have emerged as the new paradigm for power system operation and management, being designed to include large amounts of distributed energy resources. This new paradigm requires new Energy Resource Management (ERM) methodologies considering different operation strategies and the existence of new management players such as several types of aggregators. This paper proposes a methodology to facilitate the coalition of distributed generation units into Virtual Power Players (VPP), considering a game theory approach. The proposed approach consists of analysing the classifications attributed by each VPP to the distributed generation units, as well as the contracts previously established by each player. The proposed classification model is based on fourteen parameters, including technical, economic and behavioural ones. Depending on the VPP's strategies, size and goals, each parameter has a different importance. VPPs can also manage other types of energy resources, such as storage units, electric vehicles, demand response programs, or even parts of the MV and LV distribution networks. A case study with twelve VPPs with different characteristics and one hundred and fifty real distributed generation units is included in the paper.
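
As a rough illustration of the classification idea (three placeholder parameters instead of the paper's fourteen, with made-up weights and data), a VPP-specific weighted scoring of DG units could look like this:

```python
# Minimal sketch of VPP-specific weighted scoring of DG units.  Parameters,
# weights and unit data are illustrative placeholders only.
import numpy as np

#                      availability  cost score  contract compliance
units = {"DG-01": np.array([0.95, 0.40, 0.90]),
         "DG-02": np.array([0.80, 0.90, 0.70]),
         "DG-03": np.array([0.99, 0.20, 0.95])}

# Each VPP weighs the (normalized) parameters according to its strategy.
vpp_weights = {"VPP-reliability": np.array([0.6, 0.1, 0.3]),
               "VPP-low-cost":    np.array([0.2, 0.6, 0.2])}

for vpp, w in vpp_weights.items():
    scores = {u: float(w @ x) for u, x in units.items()}
    ranking = sorted(scores, key=scores.get, reverse=True)
    print(vpp, "prefers:", ranking)
```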