983 results for Inverse Gaussian Distribution


Relevance: 80.00%

Abstract:

Alterations in oncogenes and tumor suppressor genes (TSGs) are considered critical steps in oncogenesis. Consistent deletions and loss of heterozygosity (LOH) of polymorphic markers within a given chromosomal fragment are known to be indicative of a closely mapping TSG. Deletion of the long arm of human chromosome 7 (hchr 7) is a frequent trait in many kinds of human primary tumors. LOH was studied with an extensive set of markers on chromosome 7q in several types of human neoplasias (primary breast, prostate, colon, ovarian, and head and neck carcinomas) to determine the location of a putative TSG. The extent of LOH varied depending on the type of tumor studied, but all the LOH curves we obtained peaked at the (C-A)n microsatellite repeat D7S522 at 7q31.1 and showed a Gaussian distribution. The high incidence of LOH in all tumor types studied suggests that a TSG relevant to the development of epithelial cancers is present at 7q31.1. To investigate whether the putative TSG is conserved in the syntenic mouse locus, we studied LOH of 30 markers along mouse chromosome 6 (mchr 6) in chemically induced squamous cell carcinomas (SCCs). Tumors were obtained from SENCAR and C57BL/6 x DBA/2 F1 females by a two-stage carcinogenesis protocol. The high incidence of LOH in the tumor types studied suggests that a TSG relevant to the development of epithelial cancers is present on mchr 6 A1. Since this segment is syntenic with hchr 7q31, these data indicate that the putative TSG is conserved in both species. Functional evidence for the existence of a TSG on hchr 7 was obtained by microcell fusion transfer of a single hchr 7 into a murine SCC-derived cell line. Five of seven hybrids had two- to three-fold longer latency periods in in vivo tumorigenicity assays than the parental cells. One of the unrepressed hybrids had a deletion in the introduced chromosome 7 involving q31.1-q31.3, confirming the LOH data.

Relevance: 80.00%

Abstract:

Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus, which are classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibit non-normal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed.

Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques. Cells per sample ranged from 52 to 458. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells.

Data transformations such as log, square root, and 1/x did not yield normality as measured by the Shapiro-Wilk test. A modulus transformation, used for distributions with abnormal kurtosis values, also did not produce normality.

Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and three variables (DNA concentration, shape, and elongation) showed the strongest evidence of bimodality and were studied further.

Two analytical approaches were used to obtain a summary measure for each variable for each patient: cluster analysis to determine significant clusters, and a mixture-model analysis using a two-component Gaussian model with equal variances. The mixture component parameters were used to bootstrap the log-likelihood ratio to determine the significant number of components (1 or 2). These summary measures were used as predictors of disease severity in several proportional-odds logistic regression models. The disease severity scale had 5 levels and was constructed from 3 components: extracapsular penetration (ECP), lymph node involvement (LN+), and seminal vesicle involvement (SV+), which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity. There was some indication from the mixture-model results of changes in the mean levels and proportions of the components at the lower severity levels.
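As an illustration of the mixture-model step described above, the following minimal sketch (under stated assumptions, not the study's code; `lrt_statistic` and `bootstrap_p_value` are hypothetical names) fits one- and two-component equal-variance Gaussian mixtures to a single morphometry variable and bootstraps the log-likelihood ratio under the one-component null:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def lrt_statistic(x):
    """Log-likelihood ratio between 2- and 1-component equal-variance fits."""
    x = x.reshape(-1, 1)
    g1 = GaussianMixture(n_components=1, covariance_type="tied").fit(x)
    g2 = GaussianMixture(n_components=2, covariance_type="tied").fit(x)
    n = len(x)
    return 2.0 * n * (g2.score(x) - g1.score(x))  # score() is mean log-likelihood

def bootstrap_p_value(x, n_boot=200, seed=0):
    """Parametric bootstrap of the LRT under the 1-component (normal) null."""
    rng = np.random.default_rng(seed)
    observed = lrt_statistic(x)
    mu, sigma = x.mean(), x.std()
    null_stats = [
        lrt_statistic(rng.normal(mu, sigma, size=len(x))) for _ in range(n_boot)
    ]
    return np.mean(np.array(null_stats) >= observed)
```

A small bootstrap p-value would indicate that two components are justified for that patient's variable, which is then summarized by the fitted component means and proportions.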

Relevance: 80.00%

Abstract:

Air was sampled from the porous firn layer at the NEEM site in Northern Greenland. We use an ensemble of ten reference tracers of known atmospheric history to characterise the transport properties of the site. By analysing uncertainties in both the data and the reference-gas atmospheric histories, we can objectively assign weights to each of the gases used for the depth-diffusivity reconstruction. We define an objective root-mean-square criterion that is minimised in the model tuning procedure. Each tracer constrains the firn profile differently through its unique atmospheric history and free-air diffusivity, making our multiple-tracer characterisation method a clear improvement over the commonly used single-tracer tuning. Six firn air transport models are tuned to the NEEM site; all models successfully reproduce the data within a 1σ Gaussian distribution. A comparison between two replicate boreholes drilled 64 m apart shows differences in measured mixing ratio profiles that exceed the experimental error. We find evidence that diffusivity does not vanish completely in the lock-in zone, as is commonly assumed. The ice age-gas age difference (Δage) at the firn-ice transition is calculated to be 182 (+3/−9) yr. We further present the first intercomparison study of firn air models, in which we introduce diagnostic scenarios designed to probe specific aspects of the model physics. Our results show that there are major differences in the way the models handle advective transport. Furthermore, diffusive fractionation of isotopes in the firn is poorly constrained by the models, which has consequences for attempts to reconstruct the isotopic composition of trace gases back in time using firn air and ice core records.
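The tuning criterion described above can be sketched as a weighted root-mean-square misfit over all tracers, with per-tracer weights set by the combined data and atmospheric-history uncertainties. This is a minimal illustration, not the models' actual code, and `rms_misfit` is a hypothetical name:

```python
import numpy as np

def rms_misfit(modeled, measured, sigma):
    """Weighted RMS over all tracers and sampling depths.

    modeled, measured: dict tracer -> array of mixing ratios at sample depths.
    sigma: dict tracer -> combined (data + atmospheric-history) 1-sigma error,
    which sets each tracer's weight in the diffusivity tuning.
    """
    terms = []
    for tracer in measured:
        z = (modeled[tracer] - measured[tracer]) / sigma[tracer]
        terms.append(z**2)
    return np.sqrt(np.concatenate(terms).mean())
```

Minimising this quantity over the depth-diffusivity profile lets each tracer contribute in proportion to how well its atmospheric history is known.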

Relevance: 80.00%

Abstract:

Probabilistic modeling is the defining characteristic of estimation of distribution algorithms (EDAs), determining their behavior and performance in optimization. Regularization is a well-known statistical technique for obtaining an improved model by reducing the generalization error of estimation, especially in high-dimensional problems. ℓ1-regularization is one such technique, with the appealing variable-selection property that it results in sparse model estimates. In this thesis, we study the use of regularization techniques for model learning in EDAs. Several methods for regularized model estimation in continuous domains based on a Gaussian distribution assumption are presented and analyzed from different aspects when used for optimization in a high-dimensional setting, where the population size of the EDA scales logarithmically with the number of variables. The optimization results obtained for a number of continuous problems with an increasing number of variables show that the proposed EDA based on regularized model estimation performs more robust optimization, and is able to achieve significantly better results for larger dimensions than other Gaussian-based EDAs. We also propose a method for learning a marginally factorized Gaussian Markov random field model using regularization techniques and a clustering algorithm. The experimental results show notable optimization performance on continuous additively decomposable problems when using this model estimation method.

Our study also covers multi-objective optimization, and we propose joint probabilistic modeling of variables and objectives in EDAs based on Bayesian networks, specifically models inspired by multi-dimensional Bayesian network classifiers. It is shown that with this approach to modeling, two new types of relationships are encoded in the estimated models in addition to the variable relationships captured in other EDAs: objective-variable and objective-objective relationships. An extensive experimental study shows the effectiveness of this approach for multi- and many-objective optimization. With the proposed joint variable-objective modeling, in addition to the Pareto set approximation, the algorithm is also able to obtain an estimate of the multi-objective problem structure.

Finally, the study of multi-objective optimization based on joint probabilistic modeling is extended to noisy domains, where the noise in objective values is represented by intervals. A new version of the Pareto dominance relation for ordering solutions in these problems, namely α-degree Pareto dominance, is introduced and its properties are analyzed. We show that ranking methods based on this dominance relation can result in competitive performance of EDAs with respect to the quality of the approximated Pareto sets. This dominance relation is then used together with a method for joint probabilistic modeling based on ℓ1-regularization for multi-objective feature subset selection in classification, where six different measures of accuracy are considered as objectives with interval values. The individual assessment of the proposed joint probabilistic modeling and solution ranking methods on datasets of small to medium dimensionality, using two different Bayesian classifiers, shows that comparable or better Pareto sets of feature subsets are approximated in comparison to standard methods.
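A minimal sketch of the kind of regularized model estimation the thesis studies, assuming scikit-learn's graphical lasso as the ℓ1-regularized Gaussian estimator (function and parameter names are illustrative, not the thesis implementation):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

def eda_step(population, fitness, alpha=0.1, truncation=0.3, rng=None):
    """One EDA iteration with an l1-regularized Gaussian model.

    The l1 penalty (alpha) yields a sparse precision matrix, which keeps the
    estimation stable when the population is small relative to the dimension.
    Fitness is assumed to be minimized.
    """
    rng = rng or np.random.default_rng()
    n, d = population.shape
    # Truncation selection: keep the best fraction of the population.
    selected = population[np.argsort(fitness)[: int(truncation * n)]]
    # Regularized estimation of the Gaussian model (sparse precision matrix).
    model = GraphicalLasso(alpha=alpha).fit(selected)
    # Sample the next population from the estimated distribution.
    return rng.multivariate_normal(model.location_, model.covariance_, size=n)
```

The penalty strength `alpha` and the truncation ratio are arbitrary choices here; in the high-dimensional regime described above they would be tied to the logarithmic population-size scaling.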

Relevance: 80.00%

Abstract:

This paper presents some initial attempts to mathematically model the dynamics of a continuous estimation of distribution algorithm (EDA) based on a Gaussian distribution and truncation selection. Case studies are conducted on both unimodal and multimodal problems to highlight the effectiveness of the proposed technique and to explore some important properties of the EDA. Under some general assumptions, we show that, for 1D unimodal problems with the (μ, λ) scheme: (1) the behaviour of the EDA depends only on the general shape of the test function, rather than its specific form; (2) when initialized far from the global optimum, the EDA has a tendency to converge prematurely; (3) given a certain selection pressure, there is a unique value of the proposed amplification parameter that helps the EDA achieve desirable performance. For 1D multimodal problems: (1) the EDA could get stuck with the (μ, λ) scheme; (2) the EDA will never get stuck with the (μ + λ) scheme.
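A toy version of the analysed algorithm: a 1D Gaussian EDA with truncation selection, with a variance amplification factor `c` standing in for the paper's amplification parameter (all names and defaults here are illustrative):

```python
import numpy as np

def gaussian_eda(f, mu=20, lam=100, c=1.5, steps=200, init=(-10.0, 1.0), seed=0):
    """Minimize f with a 1D Gaussian EDA using (mu, lambda) truncation selection."""
    rng = np.random.default_rng(seed)
    mean, std = init
    for _ in range(steps):
        pop = rng.normal(mean, std, size=lam)    # sample lambda offspring
        best = pop[np.argsort(f(pop))[:mu]]      # truncation: keep the best mu
        mean, std = best.mean(), c * best.std()  # refit model, amplify variance
    return mean, std

# Example: a 1D quadratic; the model mean should approach the optimum at 0,
# and without amplification (c = 1) the variance tends to collapse too early.
m, s = gaussian_eda(lambda x: x**2)
```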

Relevance: 80.00%

Abstract:

Using techniques from statistical physics, the annealed VC entropy for hyperplanes in high-dimensional spaces is calculated as a function of the margin for a spherical Gaussian distribution of inputs.

Relevance: 80.00%

Abstract:

The effect of having a fixed differential group delay (DGD) term in the coarse-step method results in a periodic pattern in the autocorrelation function. We solve this problem by inserting a varying DGD term at each integration step, drawn from a Gaussian distribution. Simulation results are given to illustrate the phenomenon and provide some evidence about its statistical nature.

Relevance: 80.00%

Abstract:

The effect of having a fixed differential group delay (DGD) term in the coarse-step method results in a periodic pattern in the autocorrelation function. We solve this problem by inserting a varying DGD term at each integration step, drawn from a Gaussian distribution. Simulation results are given to illustrate the phenomenon and provide some evidence about its statistical nature.
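A minimal sketch of the fix described in this and the preceding abstract, assuming each coarse step's DGD is simply redrawn from a Gaussian centred on the nominal value (the function name, the relative spread, and the clipping at zero are illustrative choices):

```python
import numpy as np

def coarse_step_dgd(n_steps, mean_dgd, rel_spread=0.2, seed=0):
    """Per-step DGD values (e.g. in ps) for a coarse-step PMD simulation.

    Drawing each step's DGD from a Gaussian around mean_dgd, instead of using
    a fixed value, removes the artificial periodicity in the autocorrelation.
    """
    rng = np.random.default_rng(seed)
    dgd = rng.normal(mean_dgd, rel_spread * mean_dgd, size=n_steps)
    return np.clip(dgd, 0.0, None)  # DGD magnitudes cannot be negative
```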

Relevance: 80.00%

Abstract:

We introduce a general technique for revealing, in experiments whose electrical bandwidth is lower than the optical bandwidth of the signal under study, whether the statistical properties of the light source obey a Gaussian distribution or whether mode correlations exist. To do so, one performs a series of measurements with decreasing measurement bandwidth. We develop a simple model of bandwidth-limited measurements and predict universal laws for how the intensity probability density function and the intensity autocorrelation function of an ideal, completely stochastic source with Gaussian statistics depend on the limited measurement bandwidth and the measurement noise level. The results of the experimental investigation are in good agreement with the model predictions. In particular, we reveal partial mode correlations in the radiation of a quasi-CW Raman fibre laser.
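The bandwidth-limiting effect described above can be illustrated numerically. In this sketch a moving average stands in for the detector's limited electrical bandwidth (an assumption of the sketch, not the authors' model); the normalized variance of the measured intensity falls below the value of 1 expected for exponential (Gaussian-field) statistics as the bandwidth shrinks:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2**18
# Fully stochastic field with Gaussian statistics: intensity is exponential.
field = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
intensity = np.abs(field) ** 2

def detect(intensity, bw_ratio):
    """Emulate a detector of relative bandwidth bw_ratio by a moving average."""
    window = max(1, int(round(1.0 / bw_ratio)))
    kernel = np.ones(window) / window
    return np.convolve(intensity, kernel, mode="valid")

for bw in (1.0, 0.1, 0.01):
    m = detect(intensity, bw)
    # 1.0 for true exponential statistics; smaller as bandwidth decreases.
    print(f"bw_ratio={bw}: normalized variance = {m.var() / m.mean()**2:.3f}")
```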

Relevance: 80.00%

Abstract:

We report an efficient power-tapping device working in the near-infrared (800 nm) wavelength region, based on a UV-inscribed 45° tilted fiber grating (45°-TFG) structure. Five 45°-TFGs were UV-inscribed in hydrogenated PS750 fiber using a custom-designed phase mask, with grating lengths of 3 mm, 5 mm, 9 mm, 12 mm and 15 mm, showing polarization dependent losses (PDLs) of 1 dB, 3 dB, 7 dB, 10 dB and 13 dB, respectively. The power side-tapping efficiency clearly depends on the grating strength: it increases with the grating strength and decreases along the grating length. The side-tapped power profile has also been examined in the azimuthal direction, showing a near-Gaussian distribution. These experimental results clearly demonstrate that 45°-TFGs may be used as in-fiber power-tapping devices for applications requiring in-line signal monitoring.

Relevance: 80.00%

Abstract:

* Research supported by NATO Grant CRG 900 798 and by a Humboldt Award for U.S. Scientists.

Relevance: 80.00%

Abstract:

A detailed quantitative numerical analysis of a partially coherent quasi-CW fiber laser is performed, using a high-Q-cavity Raman fiber laser as an example. The key role of the precise spectral characteristics of the fiber Bragg gratings forming the laser cavity is clarified. It is shown that cross-phase modulation between the pump and Stokes waves does not affect the generation. The amplitudes of the different longitudinal modes fluctuate strongly, obeying a Gaussian distribution. Since the intensity statistics are noticeably non-exponential, the longitudinal modes must be correlated. © 2011 SPIE.
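The inference in the last two sentences rests on a standard fact that is easy to check numerically: uncorrelated complex-Gaussian mode amplitudes produce exponential intensity statistics, so a non-exponential measurement implies mode correlations. A minimal sketch (the mode and sample counts are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n_modes, n_samples = 100, 20_000
# Independent complex-Gaussian amplitudes for each longitudinal mode.
amps = (rng.normal(size=(n_samples, n_modes))
        + 1j * rng.normal(size=(n_samples, n_modes)))
intensity = np.abs(amps.sum(axis=1)) ** 2
# For exponential intensity statistics, <I^2>/<I>^2 equals 2.
print((intensity**2).mean() / intensity.mean() ** 2)  # ~2.0 when uncorrelated
```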

Relevance: 80.00%

Abstract:

The increase in renewable energy generators introduced into the electricity grid is putting pressure on its stability and management, as renewable energy sources can be neither accurately predicted nor fully controlled. Together with the additional pressure of fluctuations in demand, this presents a problem more complex than the one current methods of controlling electricity distribution were designed for. An approximate, distributed global optimisation method for power allocation that accommodates uncertainties and volatility is suggested and analysed. It is based on a probabilistic method known as message passing [1], which has deep links to statistical-physics methodology. This principled optimisation method relies on local calculations and inherently accommodates uncertainties; it is of modest computational complexity and provides good approximate solutions. We consider uncertainty and fluctuations drawn from a Gaussian distribution and incorporate them into the message-passing algorithm. We examine the effect that increasing uncertainty has on the transmission cost, and how the placement of volatile nodes within a grid, such as renewable generators or consumers, affects it.
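As a rough illustration of the effect studied, the following Monte Carlo sketch perturbs nodal loads with zero-mean Gaussian fluctuations and evaluates the resulting quadratic transmission cost. This is a plain sampling experiment, not the paper's message-passing algorithm, and the random network and cost model are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_lines = 10, 15
# Random incidence-like matrix mapping line flows to nodal injections.
A = rng.normal(size=(n_nodes, n_lines))
base_load = rng.normal(size=n_nodes)
base_load -= base_load.mean()           # net injections must balance

def expected_cost(sigma, n_samples=2000):
    costs = []
    for _ in range(n_samples):
        load = base_load + rng.normal(0.0, sigma, size=n_nodes)
        load -= load.mean()             # rebalance after the fluctuation
        # Minimum-norm flows meeting the loads = min quadratic cost allocation.
        flows, *_ = np.linalg.lstsq(A, load, rcond=None)
        costs.append(np.sum(flows**2))
    return np.mean(costs)

for sigma in (0.0, 0.1, 0.3):
    print(sigma, expected_cost(sigma))  # cost grows with uncertainty level
```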

Relevance: 80.00%

Abstract:

Most approaches to stereo visual odometry reconstruct the motion by tracking point features along a sequence of images. However, in low-textured scenes it is often difficult to find a large set of point features, or they may be poorly distributed over the image, so the behavior of these algorithms deteriorates. This paper proposes a probabilistic approach to stereo visual odometry, based on the combination of both point and line-segment features, that works robustly in a wide variety of scenarios. The camera motion is recovered through non-linear minimization of the projection errors of both point and line-segment features. To combine the two types of features effectively, their associated errors are weighted according to their covariance matrices, computed by propagating Gaussian error distributions from the sensor measurements. The method is, of course, computationally more expensive than using only one type of feature, but it can still run in real time on a standard computer and provides interesting advantages, including straightforward integration into any probabilistic framework commonly employed in mobile robotics.
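The covariance weighting described above amounts to scoring each feature's residual by its squared Mahalanobis norm, so uncertain features contribute less to the motion estimate. A minimal sketch, with `weighted_cost` a hypothetical helper name:

```python
import numpy as np

def weighted_cost(residuals, covariances):
    """Sum of squared Mahalanobis norms: r^T Sigma^{-1} r for each feature.

    residuals: list of projection-error vectors (points and line segments).
    covariances: matching list of propagated measurement covariance matrices.
    """
    total = 0.0
    for r, cov in zip(residuals, covariances):
        total += float(r @ np.linalg.solve(cov, r))
    return total
```

A non-linear least-squares solver would minimize this cost over the camera motion parameters, with both feature types contributing on a common, uncertainty-aware scale.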

Relevance: 80.00%

Abstract:

In the field of vibration qualification testing, with the popular Random Control mode of shakers, the specimen is excited by random vibrations typically specified in the form of a Power Spectral Density (PSD). The corresponding signals are stationary and Gaussian, i.e. they follow a normal distribution. Conversely, real-life excitations are frequently non-Gaussian, exhibiting high peaks and/or bursts and/or deterministic harmonic components. Kurtosis is a parameter often used to statistically describe the occurrence and significance of high peak values in a random process. Since the similarity between test input profiles and real-life excitations is fundamental for qualification test reliability, kurtosis-control methods can be implemented to synthesize realistic (non-Gaussian) input signals. Durability tests are performed to check the resistance of a component to vibration-based fatigue damage. A procedure to synthesize test excitations that starts from measured data and preserves both the damage potential and the characteristics of the reference signals is desirable. The Fatigue Damage Spectrum (FDS) is generally used to quantify the fatigue damage potential associated with the excitation. The signal synthesized for accelerated durability tests (i.e. with a limited duration) must feature the same FDS as the reference vibration computed for the component's expected lifetime. Current standard procedures are efficient in synthesizing signals in the form of a PSD, but prove inaccurate if the reference data are non-Gaussian. This work presents novel algorithms for the synthesis of accelerated durability test profiles with a prescribed FDS and a non-Gaussian distribution. An experimental campaign is conducted to validate the algorithms by testing their accuracy, robustness, and practical effectiveness. Moreover, an original procedure is proposed for estimating the fatigue damage potential, aiming to minimize the computation time. The research thus aims to improve both the effectiveness and the efficiency of excitation-profile synthesis for accelerated durability tests.
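For reference, a short sketch of the kurtosis statistic mentioned above: a stationary Gaussian record scores about 3, while bursty records score higher (the burst model here is an arbitrary illustration, not a measured excitation):

```python
import numpy as np

def kurtosis(x):
    """Fourth standardized moment: E[(x - mean)^4] / std^4 (Gaussian -> 3)."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    return np.mean(z**4) / np.mean(z**2) ** 2

rng = np.random.default_rng(0)
gaussian = rng.normal(size=100_000)
# Amplify ~1% of the samples to mimic sparse bursts in a real-life record.
bursty = gaussian * (1 + 3 * (rng.random(100_000) < 0.01))
print(kurtosis(gaussian), kurtosis(bursty))  # ~3 vs. noticeably > 3
```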