931 results for Distributions (probability)


Relevance: 20.00%

Publisher:

Abstract:

The extent to which density-dependent processes regulate natural populations is the subject of an ongoing debate. We contribute evidence to this debate by showing that density-dependent processes influence the population dynamics of the ectoparasite Aponomma hydrosauri (Acari: Ixodidae), a tick species that infests reptiles in Australia. The first piece of evidence comes from an unusually long-term dataset on the distribution of ticks among individual hosts. If density-dependent processes are influencing either host mortality or vital rates of the parasite population, and those distributions can be approximated with negative binomial distributions, then general host-parasite models predict that the aggregation coefficient of the parasite distribution will increase with the average intensity of infections. We fit negative binomial distributions to the frequency distributions of ticks on hosts, and find that the estimated aggregation coefficient k increases with increasing average tick density. This pattern indirectly implies that one or more vital rates of the tick population must be changing with increasing tick density, because mortality rates of the tick's main host, the sleepy lizard, Tiliqua rugosa, are unaffected by changes in tick burdens. Our second piece of evidence is a re-analysis of experimental data on the attachment success of individual ticks to lizard hosts using generalized linear modelling. The probability of successful engorgement decreases with increasing numbers of ticks attached to a host. This is direct evidence of a density-dependent process that could lead to an increase in the aggregation coefficient of tick distributions described earlier. The population-scale increase in the aggregation coefficient is indirect evidence of a density-dependent process or processes sufficiently strong to produce a population-wide pattern, and thus also likely to influence population regulation. The direct observation of a density-dependent process is evidence of at least part of the responsible mechanism.
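The moment relationship behind the aggregation coefficient can be sketched in a few lines. This is a minimal method-of-moments illustration with hypothetical tick counts (not the study's data): for a negative binomial with mean m and sample variance s², k = m² / (s² − m), and a larger k at higher mean burden is the pattern the abstract describes.

```python
import statistics

def aggregation_k(counts):
    """Method-of-moments estimate of the negative binomial
    aggregation coefficient k = m^2 / (s^2 - m).
    Requires overdispersed counts (variance > mean)."""
    m = statistics.mean(counts)
    v = statistics.variance(counts)  # sample variance
    if v <= m:
        raise ValueError("counts are not overdispersed")
    return m * m / (v - m)

# Hypothetical tick burdens on two host samples: as the mean burden
# rises, a larger k indicates a less aggregated distribution.
low_density  = [0, 0, 1, 0, 2, 0, 0, 5, 0, 1, 0, 0, 3, 0, 0]
high_density = [0, 8, 1, 6, 0, 2, 9, 1, 3, 7, 0, 4, 12, 2, 5]

k_low = aggregation_k(low_density)    # mean 0.8, strongly aggregated
k_high = aggregation_k(high_density)  # mean 4.0, larger k
```

In practice one would fit k by maximum likelihood rather than moments, but the moment estimator makes the mean-variance logic explicit.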

Abstract:

Sensitivity of output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite power or directionally generic inputs whose anisotropy is bounded above by a≥0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation invariant operators over such fields.
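The minimal Kullback-Leibler divergence described above has a closed form in the Gaussian case, which is enough for a small numeric sketch (a 2×2 illustration under the zero-mean Gaussian assumption, not the operator-theoretic setting of the paper): minimizing KL(N(0, Σ) ‖ N(0, λI)) over λ gives λ = tr(Σ)/n and a divergence of (n/2)·ln(tr(Σ)/n) − (1/2)·ln det(Σ).

```python
import math

def anisotropy_gaussian(cov):
    """Anisotropy of a zero-mean Gaussian vector: the minimal
    Kullback-Leibler divergence from the family N(0, lambda*I),
    attained at lambda = trace(cov)/n and equal to
    (n/2)*ln(trace(cov)/n) - (1/2)*ln(det(cov)).  2x2 case only."""
    (a, b), (c, d) = cov
    tr = a + d
    det = a * d - b * c
    n = 2
    return 0.5 * n * math.log(tr / n) - 0.5 * math.log(det)

# A scalar covariance (white-noise-like input) has zero anisotropy;
# stretching one direction makes the vector more "directional".
iso = anisotropy_gaussian([[1.0, 0.0], [0.0, 1.0]])
aniso = anisotropy_gaussian([[4.0, 0.0], [0.0, 1.0]])
```

The zero value for a scalar covariance matches the white-noise hypothesis of LQG control being the reference point of the anisotropy functional.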

Abstract:

The Central Limit Theorem and the Law of Large Numbers are among the most important results of probability theory. The first seeks conditions under which [formula] converges in distribution to the normal distribution with parameters 0 and 1 as n tends to infinity, where Sn is the sum of n independent random variables. The second establishes conditions under which [formula] converges to zero or, equivalently, under which [formula] converges to the expectation of the random variables, provided they are identically distributed. In both cases the sequences considered are of the form [formula], where [formula] and [formula] are real constants. Characterizing the possible limits of such sequences is one of the goals of this dissertation, since they do not converge exclusively to a degenerate or normally distributed random variable, as in the Law of Large Numbers and the Central Limit Theorem, respectively. This leads naturally to the study of infinitely divisible and stable distributions and their limit theorems, which is the main objective of this dissertation. The proofs rely chiefly on the Lyapunov method, which analyzes the convergence of the sequence of characteristic functions associated with the random variables. Accordingly, this work also treats such functions in detail.
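The characteristic function approach mentioned above can be illustrated numerically: by the Central Limit Theorem, the real part of the empirical characteristic function of standardized sums should approach e^(−t²/2), the characteristic function of N(0, 1). A small simulation sketch with uniform summands (illustrative only):

```python
import math
import random

random.seed(42)

def standardized_sum(n):
    """(S_n - n*mu) / (sigma * sqrt(n)) for S_n a sum of n iid
    Uniform(0, 1) variables, with mu = 1/2 and sigma^2 = 1/12."""
    s = sum(random.random() for _ in range(n))
    return (s - n * 0.5) / math.sqrt(n / 12.0)

# Real part of the empirical characteristic function at t = 1.0;
# by the CLT it should approach exp(-t^2 / 2) ~ 0.6065.
samples = [standardized_sum(50) for _ in range(20000)]
t = 1.0
ecf = sum(math.cos(t * x) for x in samples) / len(samples)
target = math.exp(-t * t / 2)
```

Lévy's continuity theorem is what licenses the step from pointwise convergence of characteristic functions to convergence in distribution.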

Abstract:

We calculate the equilibrium thermodynamic properties, percolation threshold, and cluster distribution functions for a model of associating colloids, which consists of hard spherical particles having on their surfaces three short-ranged attractive sites (sticky spots) of two different types, A and B. The thermodynamic properties are calculated using Wertheim's perturbation theory of associating fluids. This also allows us to find the onset of self-assembly, which can be quantified by the maxima of the specific heat at constant volume. The percolation threshold is derived, under the no-loop assumption, for the correlated bond model: in all cases it is two percolated phases that become identical at a critical point, when one exists. Finally, the cluster size distributions are calculated by mapping the model onto an effective model, characterized by a state-dependent functionality f̄ and a unique bonding probability p̄. The mapping is based on the asymptotic limit of the cluster distribution functions of the generic model, and the effective parameters are defined through the requirement that the equilibrium cluster distributions of the true and effective models have the same number-averaged and weight-averaged sizes at all densities and temperatures. We also study the model numerically in the case where BB interactions are missing. In this limit, AB bonds either provide branching between A-chains (Y-junctions) if epsilon(AB)/epsilon(AA) is small, or drive the formation of a hyperbranched polymer if epsilon(AB)/epsilon(AA) is large. We find that the theoretical predictions describe quite accurately the numerical data, especially in the region where Y-junctions are present. There is fairly good agreement between theoretical and numerical results both for the thermodynamic (number of bonds and phase coexistence) and the connectivity properties of the model (cluster size distributions and percolation locus).
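The mapping above hinges on matching number-averaged and weight-averaged cluster sizes to a single bonding probability p̄. As a stand-in illustration (the classic Flory chain-aggregation distribution, not the paper's correlated bond model), both averages follow in closed form from one bonding probability:

```python
def flory_cluster_stats(p, kmax=2000):
    """Flory (most probable) chain-length distribution for bonding
    probability p: P(k) = (1 - p) * p**(k - 1).  Returns the
    number-averaged and weight-averaged cluster sizes, which should
    approach 1/(1-p) and (1+p)/(1-p) respectively."""
    pk = [(1 - p) * p ** (k - 1) for k in range(1, kmax + 1)]
    s0 = sum(pk)
    s1 = sum(k * w for k, w in enumerate(pk, 1))
    s2 = sum(k * k * w for k, w in enumerate(pk, 1))
    return s1 / s0, s2 / s1

kn, kw = flory_cluster_stats(0.9)
```

Matching these two moments between a true and an effective model is exactly the kind of constraint that pins down the pair (f̄, p̄) in the abstract.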

Abstract:

This paper presents an investigation into cloud-to-ground lightning activity over the continental territory of Portugal, using data collected by the national Lightning Location System. The Lightning Location System in Portugal is first presented. Analyses of the geographical, seasonal, and polarity distributions of cloud-to-ground lightning activity, and of the cumulative probability of peak current, are carried out. An overall ground flash density map is constructed from the database, which contains more than five years of information and almost four million records. This map is compared with the thunderstorm days map produced by the Portuguese Institute of Meteorology and with the orographic map of Portugal. Finally, conclusions are duly drawn.
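The cumulative probability of peak current analyzed above is, in practice, an empirical CDF over the recorded strokes. A minimal sketch with hypothetical peak currents (illustrative values, not the Portuguese dataset):

```python
def empirical_cdf(values):
    """Return a function giving the empirical cumulative probability
    P(X <= x) from a sample of observed peak currents."""
    xs = sorted(values)
    n = len(xs)

    def cdf(x):
        # Count of observations <= x; a linear scan is fine for a sketch.
        return sum(1 for v in xs if v <= x) / n

    return cdf

# Hypothetical flash peak currents in kA (illustrative only).
peaks_kA = [12, 18, 25, 31, 9, 22, 40, 15, 27, 35, 20, 14, 55, 19, 23]
cdf = empirical_cdf(peaks_kA)
p30 = cdf(30)  # fraction of flashes with peak current <= 30 kA
```

Lightning studies often then fit a lognormal model to such a CDF; the empirical version is the model-free starting point.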

Abstract:

One of the main arguments in favour of the adoption of, and convergence with, the international accounting standards published by the IASB (i.e. IAS/IFRS) is that these will allow comparability of financial reporting across countries. However, because these standards use verbal probability expressions (e.g. “probable”) when establishing the recognition and disclosure criteria for accounting elements, they require professional accountants to interpret and classify the probability of an outcome or event taking those terms and expressions into account, and to decide accordingly in terms of financial reporting. This paper reports part of research we carried out on the interpretation of “in context” verbal probability expressions used in the IAS/IFRS by the auditors registered with the Portuguese Securities Market Commission, the Comissão do Mercado de Valores Mobiliários (CMVM). Our results provide support for the hypothesis that culture affects the CMVM-registered auditors’ interpretation of verbal probability expressions through its influence on the accounting value (or attitude) of conservatism. Our results also suggest that there are significant differences in their interpretation of the term “probable”, which is consistent with the literature in general. Since “probable” is the most frequent verbal probability expression used in the IAS/IFRS, this may have a negative impact on the comparability of financial statements.

Abstract:

Dissertation presented to the Instituto Politécnico do Porto for the degree of Master in Logistics. Supervised by: Prof. Dr. Pedro Godinho

Abstract:

The paper proposes a methodology to increase the probability of delivering power to any load point by identifying new investments in distribution energy systems. The proposed methodology is based on statistical failure and repair data of distribution components and uses fuzzy-probabilistic modeling of the components' outage parameters. The fuzzy membership functions of the outage parameters of each component are based on statistical records. A mixed-integer nonlinear programming optimization model is developed to identify the adequate investments in distribution energy system components that increase the probability of delivering power to any customer in the distribution system at the minimum possible cost for the system operator. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 180-bus distribution network.
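A crisp (non-fuzzy) sketch of the underlying reliability arithmetic may help: given statistical failure and repair data, each component's steady-state availability follows from its MTTF and MTTR, and a series path delivers power only if every component is up. The feeder data below are hypothetical, and the paper's fuzzy-probabilistic MINLP model is not reproduced here:

```python
def availability(failure_rate, repair_time):
    """Steady-state availability of a repairable component:
    A = MTTF / (MTTF + MTTR), with MTTF = 1 / failure_rate
    (failures per year) and repair_time (MTTR) in years."""
    mttf = 1.0 / failure_rate
    return mttf / (mttf + repair_time)

def delivery_probability(path):
    """Probability that a series path of independent components is
    fully up, i.e. that power can be delivered through it."""
    p = 1.0
    for rate, mttr in path:
        p *= availability(rate, mttr)
    return p

# Hypothetical feeder: (failures per year, repair time in years).
feeder = [(0.5, 4 / 8760), (0.2, 8 / 8760), (1.0, 2 / 8760)]
p_base = delivery_probability(feeder)

# Candidate investment: halve the failure rate of the worst component.
upgraded = [(0.5, 4 / 8760), (0.2, 8 / 8760), (0.5, 2 / 8760)]
p_new = delivery_probability(upgraded)
```

The optimization model in the paper would choose among such candidate investments, trading the resulting probability gain against cost, with the outage parameters fuzzified rather than crisp.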

Abstract:

Distributed generation, unlike centralized electrical generation, aims to generate electrical energy on a small scale as near as possible to load centers, interchanging electric power with the network. This work presents a probabilistic methodology conceived to assist electric system planning engineers in selecting the location of distributed generation, taking into account the hourly load changes or the daily load cycle. The hourly load centers, for each of the different hourly load scenarios, are calculated deterministically. These location points, properly weighted according to their load magnitude, are used to calculate the best-fit probability distribution. This distribution is used to determine the maximum-likelihood perimeter of the area where each distributed generation source should preferably be located by the planning engineers. This takes into account, for example, the availability and the cost of land lots, which are factors of special relevance in urban areas, as well as several obstacles important for the final selection of the candidate distributed generation points. The proposed methodology has been applied to a real case, assuming three different bivariate probability distributions: the Gaussian distribution, a bivariate version of Freund’s exponential distribution, and the Weibull probability distribution. The methodology algorithm has been programmed in MATLAB. Results are presented and discussed for the application of the methodology to a realistic case and demonstrate the ability of the proposed methodology to efficiently determine the best location of the distributed generation and the corresponding distribution networks.
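The first step of the methodology, weighting the hourly load centres by load magnitude before fitting a bivariate distribution, can be sketched for the Gaussian case (one of the three distributions tested; the points and loads below are hypothetical, and the paper's implementation is in MATLAB):

```python
def weighted_mean_cov(points, weights):
    """Weighted mean and 2x2 covariance of hourly load centres,
    each point weighted by its load magnitude.  For the Gaussian
    case these two moments fully determine the fitted distribution,
    hence the maximum-likelihood ellipse around the mean."""
    W = sum(weights)
    mx = sum(w * x for (x, _), w in zip(points, weights)) / W
    my = sum(w * y for (_, y), w in zip(points, weights)) / W
    sxx = sum(w * (x - mx) ** 2 for (x, _), w in zip(points, weights)) / W
    syy = sum(w * (y - my) ** 2 for (_, y), w in zip(points, weights)) / W
    sxy = sum(w * (x - mx) * (y - my) for (x, y), w in zip(points, weights)) / W
    return (mx, my), ((sxx, sxy), (sxy, syy))

# Hypothetical hourly load centres (km) weighted by load (MW).
centres = [(2.0, 1.0), (2.5, 1.2), (1.8, 0.9), (3.0, 1.5), (2.2, 1.1)]
loads = [40.0, 55.0, 35.0, 70.0, 50.0]
mu, cov = weighted_mean_cov(centres, loads)
```

A constant-likelihood contour of the fitted Gaussian then gives the "maximum likelihood perimeter" within which the planning engineers search for feasible land lots.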

Abstract:

Copyright © 2014 The Authors. Oikos © 2014 Nordic Society Oikos.

Abstract:

The species abundance distribution (SAD) has been a central focus of community ecology for over fifty years, and is currently the subject of widespread renewed interest. The gambin model has recently been proposed as a model that provides a superior fit to commonly preferred SAD models. It has also been argued that the model's single parameter (α) presents a potentially informative ecological diversity metric, because it summarises the shape of the SAD in a single number. Despite this potential, few empirical tests of the model have been undertaken, perhaps because the necessary methods and software for fitting the model have not existed. Here, we derive a maximum likelihood method to fit the model, and use it to undertake a comprehensive comparative analysis of the fit of the gambin model. The functions and computational code to fit the model are incorporated in a newly developed free-to-download R package (gambin). We test the gambin model using a variety of datasets and compare the fit of the gambin model to fits obtained using the Poisson lognormal, logseries and zero-sum multinomial distributions. We found that gambin almost universally provided a better fit to the data and that the fit was consistent for a variety of sample grain sizes. We demonstrate how α can be used to differentiate intelligibly between community structures of Azorean arthropods sampled in different land use types. We conclude that gambin presents a flexible model capable of fitting a wide variety of observed SAD data, while providing a useful index of SAD form in its single fitted parameter. As such, gambin has wide potential applicability in the study of SADs, and ecology more generally.
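The gambin package itself is in R; as a Python sketch of one of the competing models named above, the logseries can be fitted from species richness S and total abundance N alone, using Fisher's standard relation S = α·ln(1 + N/α), solved here by bisection (the S and N values are hypothetical):

```python
import math

def fisher_alpha(S, N):
    """Fisher's alpha for a logseries SAD: solve
    S = alpha * ln(1 + N / alpha) for alpha by bisection.
    The left-hand side is increasing in alpha and bounded by N,
    so a root exists whenever 0 < S < N."""
    lo, hi = 1e-6, float(N)
    for _ in range(200):
        mid = (lo + hi) / 2
        if mid * math.log(1 + N / mid) > S:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Hypothetical community: 50 species, 2000 individuals.
alpha = fisher_alpha(S=50, N=2000)
```

Like gambin's α, Fisher's α is a single-parameter summary of SAD shape, which is why the two are natural competitors in model-comparison studies.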

Abstract:

A power law is a signature of a nonlinear system, revealing a complex system close to self-organization. Several characteristics of natural and artificial systems, such as the population sizes of cities, personal income values, the frequency of occurrence of words in texts, and earthquake magnitudes, follow power-law distributions. These distributions indicate that small occurrences are very common while large occurrences are rare, although the latter can still occur with non-negligible probability. The purpose of this work is to identify phenomena associated with power laws. The typical behaviour of these phenomena is shown using data drawn from several case studies and with the help of a meta-analysis. Power laws in natural and artificial systems approach a common pattern when the values are normalized (relative frequencies) to produce a meta-graph.
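Identifying such phenomena in data is usually done with the continuous power-law maximum-likelihood (Hill) estimator. A self-contained sketch, generating synthetic power-law data by inverse-transform sampling (the exponent 2.5 and sample size are illustrative assumptions, not values from the case studies above):

```python
import math
import random

def powerlaw_alpha_mle(data, xmin):
    """Maximum-likelihood (Hill) estimate of the continuous
    power-law exponent: alpha = 1 + n / sum(ln(x_i / xmin)),
    taken over the tail x_i >= xmin."""
    tail = [x for x in data if x >= xmin]
    return 1 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Inverse-transform sampling: if U ~ Uniform(0, 1), then
# X = xmin * (1 - U)**(-1 / (alpha - 1)) follows a power law.
random.seed(7)
true_alpha, xmin = 2.5, 1.0
data = [xmin * (1 - random.random()) ** (-1 / (true_alpha - 1))
        for _ in range(50000)]
est = powerlaw_alpha_mle(data, xmin)
```

Fitting by MLE rather than by regression on a log-log histogram avoids the well-known bias of the graphical method, though the log-log plot remains the standard visual diagnostic.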

Abstract:

As polycyclic aromatic hydrocarbons (PAHs) have a negative impact on human health due to their mutagenic and/or carcinogenic properties, the objective of this work was to study the influence of tobacco smoke on the levels and phase distribution of PAHs and to evaluate the associated health risks. The air samples were collected at two homes; 18 PAHs (the 16 PAHs considered by U.S. EPA as priority pollutants, dibenzo[a,l]pyrene and benzo[j]fluoranthene) were determined in the gas phase and associated with thoracic (PM10) and respirable (PM2.5) particles. At the home influenced by tobacco smoke the total concentrations of 18 PAHs in air ranged from 28.3 to 106 ng m⁻³ (mean of 66.7 ± 25.4 ng m⁻³), ∑PAHs being 95% higher than at the non-smoking one, where the values ranged from 17.9 to 62.0 ng m⁻³ (mean of 34.5 ± 16.5 ng m⁻³). On average 74% and 78% of ∑PAHs were present in the gas phase at the smoking and non-smoking homes, respectively, demonstrating that adequate assessment of PAHs in air requires evaluation of PAHs in both gas and particulate phases. When influenced by tobacco smoke the health risk values were 3.5–3.6 times higher due to the exposure to PM10. The values of lifetime lung cancer risks were 4.1 × 10⁻³ and 1.7 × 10⁻³ for the smoking and non-smoking homes, considerably exceeding the health-based guideline level at both homes, also due to the contribution of outdoor traffic emissions. The results showed that evaluation of benzo[a]pyrene alone would probably underestimate the carcinogenic potential of the studied PAH mixtures; in total, ten carcinogenic PAHs represented 36% and 32% of the gaseous ∑PAHs, and in the particulate phase they accounted for 75% and 71% of ∑PAHs at the smoking and non-smoking homes, respectively.
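The remark that benzo[a]pyrene alone understates a mixture's carcinogenicity is usually handled with toxic equivalency factors (TEFs), which express each congener's potency relative to benzo[a]pyrene. A sketch with a few illustrative TEF values and hypothetical concentrations (published TEF schemes differ in detail; treat both the factors and the concentrations below as assumptions, not the study's values):

```python
# Illustrative toxic equivalency factors relative to benzo[a]pyrene.
TEF = {
    "benzo[a]pyrene": 1.0,
    "benzo[a]anthracene": 0.1,
    "benzo[b]fluoranthene": 0.1,
    "chrysene": 0.01,
}

def bap_equivalent(concentrations_ng_m3):
    """Benzo[a]pyrene-equivalent concentration of a PAH mixture:
    the sum of each congener's concentration times its TEF."""
    return sum(TEF[name] * c for name, c in concentrations_ng_m3.items())

# Hypothetical indoor concentrations in ng/m3.
sample = {"benzo[a]pyrene": 0.5,
          "benzo[a]anthracene": 1.2,
          "chrysene": 2.0}
bap_eq = bap_equivalent(sample)  # 0.5*1.0 + 1.2*0.1 + 2.0*0.01
```

The BaP-equivalent concentration, rather than BaP alone, is what feeds lifetime cancer-risk estimates of the kind reported above.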

Abstract:

Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation can emerge as an adequate solution for understanding and analyzing the timing behaviour of actual systems. However, care must be taken with the obtained outputs, at the risk of producing results that lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence which can be applied to their mean values. This is the basis for a discussion on the applicability of such approaches to derive confidence on the tail of distributions, where the worst case is expected to be.
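For the tail probabilities discussed above, even a simple binomial confidence interval makes the sample-size problem visible: rare deadline misses require many replications before the interval around the miss probability becomes tight. A sketch with a placeholder response-time model (exponential, purely illustrative; not a claim about any real system's timing):

```python
import math
import random

def tail_probability_ci(samples, deadline, z=1.96):
    """Point estimate and approximate 95% normal-approximation
    confidence interval for the deadline-miss probability
    P(X > deadline), treating each replication as a Bernoulli trial."""
    n = len(samples)
    misses = sum(1 for x in samples if x > deadline)
    p = misses / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, (max(0.0, p - half), min(1.0, p + half))

# Simulated response times in ms; the exponential model is only a
# placeholder for whatever the simulation actually produces.
random.seed(1)
times = [random.expovariate(1 / 10.0) for _ in range(100000)]
p, (lo, hi) = tail_probability_ci(times, deadline=50.0)
```

The normal approximation itself degrades exactly where we care most (very small p), which is one concrete instance of the credibility concern the abstract raises about applying mean-value confidence machinery to distribution tails.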

Abstract:

Aim - To use Monte Carlo (MC) simulations together with voxel phantoms to analyze the tissue heterogeneity effect on the dose distributions and the equivalent uniform dose (EUD) for 125I prostate implants. Background - Dose distribution calculations in low-dose-rate brachytherapy are based on the dose deposition around a single source in a water phantom. This formalism does not take into account tissue heterogeneities, interseed attenuation, or finite patient dimensions. Tissue composition is especially important due to the photoelectric effect. Materials and Methods - The computed tomography (CT) scans of two patients with prostate cancer were used to create voxel phantoms for the MC simulations. An elemental composition and density were assigned to each structure. Densities of the prostate, vesicles, rectum and bladder were determined from the CT electronic densities of 100 patients. The same simulations were performed considering the same phantom as pure water. Results were compared via dose-volume histograms and EUD for the prostate and rectum. Results - The mean absorbed doses presented deviations of 3.3-4.0% for the prostate and of 2.3-4.9% for the rectum, when comparing calculations in water with calculations in the heterogeneous phantom. In the calculations in water, the prostate D90 was overestimated by 2.8-3.9% and the rectum D0.1cc resulted in dose differences of 6-8%. The EUD resulted in an overestimation of 3.5-3.7% for the prostate and of 7.7-8.3% for the rectum. Conclusions - The deposited dose was consistently overestimated in the simulation in water. In order to increase the accuracy of the determination of dose distributions, especially around the rectum, the introduction of model-based algorithms is recommended.
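The generalized EUD used in comparisons like the one above has a compact form, EUD = (mean of dᵢᵃ)^(1/a) over the voxel doses dᵢ. The voxel doses and the parameter a below are illustrative assumptions, not the study's values:

```python
def eud(doses, a):
    """Generalized equivalent uniform dose:
    EUD = ((1/N) * sum(d_i ** a)) ** (1/a).
    a = 1 recovers the mean dose; large positive a (serial organs
    such as the rectum) pushes the EUD toward the maximum dose."""
    n = len(doses)
    return (sum(d ** a for d in doses) / n) ** (1 / a)

# Hypothetical voxel doses (Gy) for illustration only.
doses = [140.0, 150.0, 160.0, 145.0, 155.0, 90.0]
mean_like = eud(doses, 1.0)  # equals the arithmetic mean
serial = eud(doses, 8.0)     # weighted toward the hot voxels
```

Because a > 1 weights hot voxels more heavily, a systematic water-phantom dose overestimate translates into an EUD overestimate of comparable or larger relative size, as the rectum figures above illustrate.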