953 results for Geometric Distributions


Relevance:

20.00%

Publisher:

Abstract:

In this paper we present a hybrid technique for correcting distortions that appear when projecting images onto geometrically complex, colored, and textured surfaces. It analyzes the optical flow that results from perspective distortions during motions of the observer and uses this information to compute the correct image warping. If this fails due to an unreliable optical flow, an accurate, but slower and visible, structured light projection is automatically triggered. Together with an appropriate radiometric compensation, view-dependent content can be projected onto arbitrary everyday surfaces. An implementation mainly on the GPU ensures fast frame rates.
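As a rough illustration of the hybrid decision logic described above (my sketch, not the authors' implementation), the snippet below estimates dense optical flow with OpenCV and falls back to a structured-light pass when the flow field looks unreliable; the variance threshold and `project_structured_light_pass` are hypothetical placeholders.

```python
# Sketch: prefer fast optical-flow-based warping, fall back to the slower,
# visible structured-light pass when the flow is unreliable.
import cv2
import numpy as np

FLOW_VARIANCE_LIMIT = 4.0  # hypothetical reliability threshold (pixels^2)

def project_structured_light_pass(image):
    """Placeholder for the accurate but slow, visible structured-light pass."""
    raise NotImplementedError("structured-light correction not sketched here")

def flow_is_reliable(flow):
    """Crude reliability heuristic: very noisy flow fields have high variance."""
    magnitude = np.linalg.norm(flow, axis=2)
    return float(np.var(magnitude)) < FLOW_VARIANCE_LIMIT

def warp_for_frame(prev_gray, curr_gray, image):
    """Warp 'image' using optical flow if reliable, else trigger the fallback."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    if not flow_is_reliable(flow):
        return project_structured_light_pass(image)
    h, w = curr_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(image, map_x, map_y, cv2.INTER_LINEAR)
```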

Relevance:

20.00%

Publisher:

Abstract:

Inventory policies provide instructions on how to calculate stock levels and order quantities. This paper examines the effect of the demand distribution on the performance of several well-known policies, comparing their performance in terms of achieved service levels. As a conclusion from these comparisons, a nonparametric demand model is proposed for use in inventory policies.
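To make the nonparametric idea concrete, here is a minimal sketch (an illustration under assumed data, not the paper's model): a base-stock policy whose order-up-to level is an empirical quantile of observed lead-time demand, so no distributional form is imposed. The 95% target service level and the gamma-distributed demand history are assumptions.

```python
# Minimal sketch: set an order-up-to level from the empirical demand
# distribution (nonparametric) instead of a fitted parametric model.
import numpy as np

def empirical_order_up_to(lead_time_demand: np.ndarray, service_level: float) -> float:
    """Order-up-to level = empirical quantile of observed lead-time demand.

    Targets roughly the requested cycle service level without assuming
    any distributional form for demand.
    """
    return float(np.quantile(lead_time_demand, service_level))

# Example with assumed demand history (units per replenishment cycle):
rng = np.random.default_rng(0)
history = rng.gamma(shape=2.0, scale=50.0, size=200)  # stand-in data
S = empirical_order_up_to(history, service_level=0.95)
print(f"order-up-to level for a 95% service level: {S:.1f}")
```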

Relevance:

20.00%

Publisher:

Abstract:

Truncated distributions of the exponential family play an important role in simulation models. This paper discusses the truncated Weibull distribution specifically. The parameters of the truncated distribution are estimated by the maximum likelihood method, either alone or combined with the expressions for its expectation and variance. After the distribution has been fitted, goodness-of-fit tests (the chi-square test and the Kolmogorov-Smirnov test) are applied to rule out rejected hypotheses. Finally, the distributions are integrated into various simulation models, e.g. a shipment consolidation model, to compare the influence of the truncated and original versions of the Weibull distribution on model behavior.
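As a hedged sketch of this fitting-and-testing pipeline (the paper's exact estimation procedure may differ), the code below fits a right-truncated Weibull distribution by maximum likelihood and checks the fit with a Kolmogorov-Smirnov test; the truncation point `T` and the sample are illustrative assumptions.

```python
# Sketch: maximum-likelihood fit of a right-truncated Weibull distribution,
# followed by a Kolmogorov-Smirnov goodness-of-fit test.
import numpy as np
from scipy import optimize, stats

T = 10.0  # assumed truncation point

def neg_log_likelihood(params, x):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    # Truncated density on (0, T]: f(x) / F(T).
    log_f = stats.weibull_min.logpdf(x, shape, scale=scale)
    log_F_T = stats.weibull_min.logcdf(T, shape, scale=scale)
    return -(log_f - log_F_T).sum()

# Illustrative sample: Weibull draws kept below the truncation point.
rng = np.random.default_rng(1)
raw = stats.weibull_min.rvs(1.5, scale=4.0, size=2000, random_state=rng)
x = raw[raw <= T]

res = optimize.minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(x,),
                        method="Nelder-Mead")
shape_hat, scale_hat = res.x

def truncated_cdf(v):
    return (stats.weibull_min.cdf(v, shape_hat, scale=scale_hat)
            / stats.weibull_min.cdf(T, shape_hat, scale=scale_hat))

ks_stat, p_value = stats.kstest(x, truncated_cdf)
print(f"shape={shape_hat:.3f} scale={scale_hat:.3f} KS p={p_value:.3f}")
```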

Relevance:

20.00%

Publisher:

Abstract:

The importance of competition between similar species in driving community assembly is much debated. Recently, phylogenetic patterns in species composition have been investigated to help resolve this question: phylogenetic clustering is taken to imply environmental filtering, and phylogenetic overdispersion to indicate limiting similarity between species. We used experimental plant communities with random species compositions and initially even abundance distributions to examine the development of phylogenetic pattern in species abundance distributions. Where composition was held constant by weeding, abundance distributions became overdispersed through time, but only in communities that contained distantly related clades, some with several species (i.e., a mix of closely and distantly related species). Phylogenetic pattern in composition therefore constrained the development of overdispersed abundance distributions, and this might indicate limiting similarity between close relatives and facilitation/complementarity between distant relatives. Comparing the phylogenetic patterns in these communities with those expected from the monoculture abundances of the constituent species revealed that interspecific competition caused the phylogenetic patterns. Opening experimental communities to colonization by all species in the species pool led to convergence in phylogenetic diversity. At convergence, communities were composed of several distantly related but species-rich clades and had overdispersed abundance distributions. This suggests that limiting similarity processes determine which species dominate a community but not which species occur in a community. Crucially, as our study was carried out in experimental communities, we could rule out local evolutionary or dispersal explanations for the patterns and identify ecological processes as the driving force, underlining the advantages of studying these processes in experimental communities. Our results show that phylogenetic relations between species provide a good guide to understanding community structure and add a new perspective to the evidence that niche complementarity is critical in driving community assembly.

Relevance:

20.00%

Publisher:

Abstract:

Let $P$ be a probability distribution on $q$-dimensional space. The so-called Diaconis-Freedman effect means that for a fixed dimension $d \ll q$, most $d$-dimensional projections of $P$ look like a scale mixture of spherically symmetric Gaussian distributions. The present paper provides necessary and sufficient conditions for this phenomenon in a suitable asymptotic framework with increasing dimension $q$. It turns out that the conditions formulated by Diaconis and Freedman (1984) are not only sufficient but necessary as well. Moreover, letting $\hat{P}$ be the empirical distribution of $n$ independent random vectors with distribution $P$, we investigate the behavior of the empirical process $\sqrt{n}(\hat{P} - P)$ under random projections, conditional on $\hat{P}$.
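A small numerical illustration of the Diaconis-Freedman effect (my sketch, unrelated to the paper's proofs): project a decidedly non-Gaussian high-dimensional sample onto a random direction and compare the projection to a Gaussian with a normality test.

```python
# Sketch: one-dimensional random projections of a high-dimensional,
# non-Gaussian sample tend to look Gaussian (Diaconis-Freedman effect).
import numpy as np
from scipy import stats

q, n = 500, 2000  # high dimension, sample size
rng = np.random.default_rng(2)

# Decidedly non-Gaussian coordinates: centered uniform in each dimension.
X = rng.uniform(-1.0, 1.0, size=(n, q))

# Random unit direction for the projection.
u = rng.standard_normal(q)
u /= np.linalg.norm(u)

projection = X @ u
# Compare the standardized projection to a standard normal distribution.
z = (projection - projection.mean()) / projection.std()
stat, p = stats.kstest(z, "norm")
print(f"KS statistic vs. normal: {stat:.4f} (p={p:.3f})")
```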

Relevance:

20.00%

Publisher:

Abstract:

We introduce and analyze hp-version discontinuous Galerkin (dG) finite element methods for the numerical approximation of linear second-order elliptic boundary-value problems in three-dimensional polyhedral domains. To resolve possible corner-, edge- and corner-edge singularities, we consider hexahedral meshes that are geometrically and anisotropically refined toward the corresponding neighborhoods. Similarly, the local polynomial degrees are increased linearly and possibly anisotropically away from singularities. We design interior penalty hp-dG methods and prove that they are well-defined for problems with singular solutions and stable under the proposed hp-refinements. We establish (abstract) error bounds that will allow us to prove exponential rates of convergence in the second part of this work.
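For orientation, a generic symmetric interior penalty (SIP) dG bilinear form for the Poisson model problem is displayed below; the methods analyzed in the paper belong to this family, but their anisotropic penalty weights and degree distributions differ, so this should be read only as the standard template.

```latex
% Generic symmetric interior penalty dG bilinear form (template only):
\[
a_h(u,v) \;=\; \sum_{K\in\mathcal{T}_h}\int_K \nabla u\cdot\nabla v \,\mathrm{d}x
\;-\; \sum_{F\in\mathcal{F}_h}\int_F \Bigl(\{\!\{\nabla u\}\!\}\cdot[\![v]\!]
\;+\; \{\!\{\nabla v\}\!\}\cdot[\![u]\!]\Bigr)\,\mathrm{d}s
\;+\; \sum_{F\in\mathcal{F}_h}\int_F \gamma\,\frac{p_F^2}{h_F}\,
[\![u]\!]\cdot[\![v]\!]\,\mathrm{d}s ,
\]
```

Here $\{\!\{\cdot\}\!\}$ and $[\![\cdot]\!]$ denote averages and jumps across the mesh faces $F$, $h_F$ is the face diameter, $p_F$ the local polynomial degree, and $\gamma > 0$ the interior penalty parameter.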

Relevance:

20.00%

Publisher:

Abstract:

The goal of this paper is to establish exponential convergence of $hp$-version interior penalty (IP) discontinuous Galerkin (dG) finite element methods for the numerical approximation of linear second-order elliptic boundary-value problems with homogeneous Dirichlet boundary conditions and piecewise analytic data in three-dimensional polyhedral domains. More precisely, we shall analyze the convergence of the $hp$-IP dG methods considered in [D. Schötzau, C. Schwab, T. P. Wihler, SIAM J. Numer. Anal., 51 (2013), pp. 1610--1633] based on axiparallel $\sigma$-geometric anisotropic meshes and $\bm{s}$-linear anisotropic polynomial degree distributions.
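To illustrate what $\sigma$-geometric refinement toward a singularity means in the simplest one-dimensional setting (a toy under my own assumptions, not the paper's 3D axiparallel meshes), the snippet below grades mesh points toward $x=0$ with ratio $\sigma$ and assigns linearly increasing polynomial degrees away from the corner.

```python
# Toy 1D analogue of sigma-geometric mesh refinement toward a corner at x=0,
# with s-linearly increasing polynomial degrees away from the singularity.
import numpy as np

def geometric_mesh(sigma: float, n_layers: int) -> np.ndarray:
    """Mesh points 0 < sigma^(n-1) < ... < sigma < 1, clustering at 0."""
    return np.concatenate(([0.0], sigma ** np.arange(n_layers - 1, -1, -1)))

def linear_degrees(n_layers: int, slope: float = 1.0, p_min: int = 1):
    """Polynomial degree grows linearly with the layer index away from 0."""
    return [max(p_min, int(np.ceil(p_min + slope * k))) for k in range(n_layers)]

pts = geometric_mesh(sigma=0.5, n_layers=6)
degs = linear_degrees(n_layers=6, slope=1.0)
for (a, b), p in zip(zip(pts[:-1], pts[1:]), degs):
    print(f"element [{a:.5f}, {b:.5f}] -> degree {p}")
```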

Relevance:

20.00%

Publisher:

Abstract:

While glucocorticoid (GC) administration appears to be beneficial during the acute phase of treatment of neonates at risk of developing chronic lung disease, it is still not clear whether steroid application has an adverse long-term effect on lung maturation. Thus, the goal of the present work was to analyze GC effects on pulmonary structure in a rat model in which dosage and timing of drug administration were adapted to the therapeutic situation in human neonatology. The animals received a maximum daily dose of 0.1 mg dexamethasone phosphate per kilogram body weight during the first 4 postnatal days. Investigations were performed at the light microscopic level by means of a digital image analysis system. While there were no differences in lung architecture between experimental animals and controls on day 4, the earliest time point of observation, we found a widening of airspaces with a concomitant decrease in alveolar surface area density, representing a loss of parenchymal complexity, on days 10 and 21 in treated rats. On days 36 and 60, however, no alterations in the pulmonary parenchyma could be detected in experimental animals. We conclude from these findings that the GC-induced initial inhibition of development (days 10 and 21) was completely reversed, so that a normal parenchymal architecture and a normal alveolar surface area density were found in adult rats (days 36 and 60). From the results obtained using the described regimen of GC administration, which mimics more closely the steroid treatment in human neonatology, we conclude that the observed short-term adverse effects on lung development are fully compensated by adulthood.

Relevance:

20.00%

Publisher:

Abstract:

Groundwater age is a key aspect of production well vulnerability. Public drinking water supply wells typically have long screens and are expected to produce a mixture of groundwater ages. The groundwater age distributions of seven production wells of the Holten well field (Netherlands) were estimated from tritium-helium (³H/³He), krypton-85 (⁸⁵Kr), and argon-39 (³⁹Ar), using a new application of a discrete age distribution model and existing mathematical models, by minimizing the uncertainty-weighted squared differences of modeled and measured tracer concentrations. The observed tracer concentrations fitted well to a 4-bin discrete age distribution model or a dispersion model with a fraction of old groundwater. Our results show that more than 75% of the water pumped by four shallow production wells has a groundwater age of less than 20 years, and these wells are very vulnerable to recent surface contamination. More than 50% of the water pumped by three deep production wells is older than 60 years. ³H/³He samples from short-screened monitoring wells surrounding the well field constrained the age stratification in the aquifer. The discrepancy between the age stratification with depth and the groundwater age distribution of the production wells showed that the well field preferentially pumps from the shallow part of the aquifer. The discrete groundwater age distribution model appears to be a suitable approach in settings where the shape of the age distribution cannot be assumed to follow a simple mathematical model, such as a production well field where wells compete for capture area.
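The following sketch conveys the flavor of fitting a discrete (binned) age distribution by minimizing uncertainty-weighted squared residuals. The per-bin model concentrations, measurements, and uncertainties below are invented placeholders, not the Holten data, and the real study derives the per-bin concentrations from tracer input histories and radioactive decay.

```python
# Sketch: fit a 4-bin discrete groundwater age distribution by minimizing
# uncertainty-weighted squared differences between modeled and measured
# tracer concentrations. All numbers below are illustrative placeholders.
import numpy as np
from scipy import optimize

# Modeled concentration of each tracer for water in each age bin
# (rows: three hypothetical tracers; columns: 4 age bins, young to old).
C_bin = np.array([
    [2.5, 1.0, 0.1, 0.0],
    [40.0, 15.0, 2.0, 0.0],
    [100.0, 95.0, 70.0, 20.0],
])
measured = np.array([1.2, 18.0, 80.0])  # hypothetical measurements
sigma = np.array([0.2, 3.0, 8.0])       # measurement uncertainties

def objective(w):
    modeled = C_bin @ w  # mixing is linear in the bin fractions
    return np.sum(((modeled - measured) / sigma) ** 2)

constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]
bounds = [(0.0, 1.0)] * 4
res = optimize.minimize(objective, x0=np.full(4, 0.25),
                        bounds=bounds, constraints=constraints,
                        method="SLSQP")
print("estimated age-bin fractions:", np.round(res.x, 3))
```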

Relevance:

20.00%

Publisher:

Abstract:

Environmental data sets of pollutant concentrations in air, water, and soil frequently include unquantified sample values reported only as being below the analytical method detection limit. These values, referred to as censored values, should be considered in the estimation of distribution parameters, as each represents some value of pollutant concentration between zero and the detection limit. Most of the currently accepted methods for estimating the population parameters of environmental data sets containing censored values rely upon the assumption of an underlying normal (or transformed normal) distribution. This assumption can result in unacceptable levels of error in parameter estimation due to the unbounded left tail of the normal distribution. With the beta distribution, which is bounded on the same range as a distribution of concentrations, $[0 \le x \le 1]$, parameter estimation errors resulting from improper distribution bounds are avoided. This work developed a method that uses the beta distribution to estimate population parameters from censored environmental data sets and evaluated its performance in comparison to currently accepted methods that rely upon an underlying normal (or transformed normal) distribution. Data sets were generated assuming typical values encountered in environmental pollutant evaluation for the mean, standard deviation, and number of variates. For each set of model values, data sets were generated assuming that the data were distributed normally, lognormally, or according to a beta distribution. For varying levels of censoring, two established methods of parameter estimation, regression on normal ordered statistics and regression on lognormal ordered statistics, were used to estimate the known mean and standard deviation of each data set. The method developed for this study, employing a beta distribution assumption, was also used to estimate parameters, and the relative accuracy of all three methods was compared. For data sets of all three distribution types, and for censoring levels up to 50%, the performance of the new method equaled, if not exceeded, that of the two established methods. Because of its robustness in parameter estimation regardless of distribution type or censoring level, the method employing the beta distribution should be considered for full development in estimating parameters for censored environmental data sets.
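A minimal sketch of the core idea, under my own simplifying assumptions (concentrations already rescaled to $[0, 1]$, a single detection limit): censored observations contribute the beta CDF at the detection limit to the likelihood, while quantified observations contribute the density.

```python
# Sketch: maximum-likelihood estimation of beta parameters from a
# left-censored sample. Values below the detection limit contribute
# the CDF at the limit; quantified values contribute the density.
import numpy as np
from scipy import optimize, stats

def neg_log_likelihood(params, observed, n_censored, detection_limit):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    ll = stats.beta.logpdf(observed, a, b).sum()
    ll += n_censored * stats.beta.logcdf(detection_limit, a, b)
    return -ll

# Illustrative data: concentrations rescaled to [0, 1], limit at 0.05.
rng = np.random.default_rng(3)
sample = stats.beta.rvs(2.0, 8.0, size=500, random_state=rng)
dl = 0.05
observed = sample[sample >= dl]
n_cens = int((sample < dl).sum())

res = optimize.minimize(neg_log_likelihood, x0=[1.0, 1.0],
                        args=(observed, n_cens, dl), method="Nelder-Mead")
a_hat, b_hat = res.x
mean_hat = a_hat / (a_hat + b_hat)
print(f"a={a_hat:.2f} b={b_hat:.2f} estimated mean={mean_hat:.4f}")
```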

Relevance:

20.00%

Publisher:

Abstract:

Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus, which are classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibit nonnormal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed.

Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques. Cells per sample ranged from 52 to 458. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells.

Data transformations such as log, square root, and 1/x did not yield normality as measured by the Shapiro-Wilk test. A modulus transformation, used for distributions having abnormal kurtosis values, also did not produce normality.

Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and 3 variables (DNA concentration, shape, and elongation) showed the strongest evidence of bimodality and were studied further.

Two analytical approaches were used to obtain a summary measure for each variable for each patient: cluster analysis to determine significant clusters, and a mixture model analysis using a two-component Gaussian model with equal variances. The mixture component parameters were used to bootstrap the log-likelihood ratio to determine the significant number of components, 1 or 2. These summary measures were used as predictors of disease severity in several proportional odds logistic regression models. The disease severity scale had 5 levels and was constructed from 3 components: extracapsular penetration (ECP), lymph node involvement (LN+), and seminal vesicle involvement (SV+), which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity. There was some indication from the mixture model results that there were changes in mean levels and proportions of the components in the lower severity levels.
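For reference, one common sample bimodality coefficient (the SAS-style statistic, which may or may not be the exact one used in the study) is sketched below; values above the uniform-distribution benchmark 5/9 ≈ 0.555 are commonly read as evidence of bimodality.

```python
# Sketch: SAS-style sample bimodality coefficient. Values above the
# benchmark 5/9 ~= 0.555 suggest bimodality.
import numpy as np
from scipy import stats

def bimodality_coefficient(x: np.ndarray) -> float:
    n = len(x)
    g1 = stats.skew(x, bias=False)                   # corrected sample skewness
    g2 = stats.kurtosis(x, fisher=True, bias=False)  # corrected excess kurtosis
    return (g1 ** 2 + 1.0) / (g2 + 3.0 * (n - 1) ** 2 / ((n - 2) * (n - 3)))

rng = np.random.default_rng(4)
unimodal = rng.normal(0.0, 1.0, size=300)
bimodal = np.concatenate([rng.normal(-2.0, 0.5, 150), rng.normal(2.0, 0.5, 150)])
print(f"unimodal BC: {bimodality_coefficient(unimodal):.3f}")
print(f"bimodal  BC: {bimodality_coefficient(bimodal):.3f}")
```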

Relevance:

20.00%

Publisher:

Abstract:

Serial correlation of extreme midlatitude cyclones observed at the storm track exits is explained by deviations from a Poisson process. To model these deviations, we apply fractional Poisson processes (FPPs) to extreme midlatitude cyclones, which are defined by the 850 hPa relative vorticity of the ERA-Interim reanalysis during boreal winter (DJF) and summer (JJA) seasons. Extremes are defined by a 99% quantile threshold in the grid-point time series. In general, FPPs are based on long-term memory and lead to non-exponential return time distributions. The return times are described by a Weibull distribution to approximate the Mittag–Leffler function in the FPPs. The Weibull shape parameter yields a dispersion parameter that agrees with results found for midlatitude cyclones. The memory of the FPP, which is determined by detrended fluctuation analysis, provides an independent estimate for the shape parameter. Thus, the analysis yields a concise framework relating the deviation from Poisson statistics (a dispersion parameter), non-exponential return times, and memory (correlation) on the basis of a single parameter. The results have potential implications for the predictability of extreme cyclones.
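To give the flavor of the return-time analysis (a toy sketch on synthetic data, not the ERA-Interim workflow): threshold a series at its 99% quantile, collect the waiting times between exceedances, and fit a Weibull distribution; a shape parameter below one indicates clustering, i.e. over-dispersion relative to a Poisson process.

```python
# Sketch: return times of 99%-quantile exceedances and a Weibull fit.
# A Weibull shape parameter < 1 indicates clustered (non-Poissonian) extremes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Toy correlated AR(1) series standing in for a vorticity time series.
n = 20000
series = np.empty(n)
series[0] = rng.standard_normal()
for t in range(1, n):
    series[t] = 0.8 * series[t - 1] + rng.standard_normal()

threshold = np.quantile(series, 0.99)
exceedance_times = np.flatnonzero(series > threshold)
return_times = np.diff(exceedance_times)

# Fit a Weibull distribution to the return times (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(return_times, floc=0)
print(f"Weibull shape={shape:.2f} scale={scale:.1f} "
      f"({'clustered' if shape < 1 else 'Poisson-like'})")
```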