67 results for Cumulative Distribution Function


Relevance: 80.00%

Abstract:

The success rate of carrier-phase ambiguity resolution (AR) is the probability that the ambiguities are fixed to their correct integer values. In existing work, an exact success-rate formula for the integer bootstrapping estimator has been used as a sharp lower bound for the integer least squares (ILS) success rate. Rigorous computation of the success rate for the more general ILS solutions has been considered difficult because of the complexity of the ILS ambiguity pull-in region and the computational load of integrating the multivariate probability density function. The contributions of this work are twofold. First, the pull-in region, mathematically expressed through the vertices of a polyhedron, is represented by a multi-dimensional grid, over which the cumulative probability can be integrated with the multivariate normal cumulative distribution function (mvncdf) available in Matlab. The bivariate case is studied, where the pull-in region is usually defined as a hexagon and the probability is easily obtained by evaluating mvncdf at all the grid points within the convex polygon. Second, the paper compares the computed integer rounding and integer bootstrapping success rates, and the lower and upper bounds of the ILS success rate, against the actual ILS AR success rates obtained from a 24 h GPS data set for a 21 km baseline. The results demonstrate that the upper bound of the ILS AR probability given in the existing literature agrees well with the actual ILS success rate, while the success rate computed with the integer bootstrapping method is also a sharp approximation to it. The results also show that epoch-to-epoch variations, or uncertainty, in the unit-weight variance estimates significantly affect the success rates computed with the different methods, and thus deserve more attention if useful success-probability predictions are to be obtained.
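The CDF-based integration described above can be sketched with SciPy's analogue of Matlab's mvncdf. The covariance matrix below is purely illustrative, and for simplicity the box is the rectangular pull-in region of integer rounding rather than the hexagonal ILS one:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Success rate of integer rounding for a bivariate float ambiguity with
# (hypothetical) covariance Q: the probability that both ambiguity
# errors fall inside the pull-in box [-0.5, 0.5]^2.
Q = np.array([[0.09, 0.03],
              [0.03, 0.16]])
mvn = multivariate_normal(mean=[0.0, 0.0], cov=Q)

# P(-0.5 < x1 < 0.5, -0.5 < x2 < 0.5) via inclusion-exclusion on the CDF
lo, hi = -0.5, 0.5
p = (mvn.cdf([hi, hi]) - mvn.cdf([hi, lo])
     - mvn.cdf([lo, hi]) + mvn.cdf([lo, lo]))
print(round(p, 4))
```

For the general polygonal pull-in region, the same CDF evaluations would be accumulated over the grid points inside the polygon.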

Relevance: 80.00%

Abstract:

Since the advent of 3D full-body scanners and the associated software systems for operations on large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by proponents of the 3D paradigm need to be critically reviewed, though. 3D anthropometry has advantages as well as shortfalls, which must be carefully weighed. While the measurement of a full-body point cloud clearly allows easier storage of raw data and improves quality control, the difficulty of calculating standardized measurements from the point cloud is widely underestimated. Early studies that used 3D point clouds to derive anthropometric dimensions showed unacceptable deviations from the standardized results measured manually. While 3D human point clouds are a valuable tool for replicating specific individuals in further virtual studies, or for personalizing garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections; a 1D/2D approach is therefore sufficient to solve an ergonomic design problem. Consequently, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach, designing for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution functions of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is viewed as a two-dimensional integral, or a distribution function of length and girth, the calculation of combined percentiles – a common ergonomic requirement – is undefined.
Consequently, we suggest critically reviewing the cost and use of 3D anthropometry, and we recommend making proper use of the widely available one- and two-dimensional anthropometric data in ergonomic design.
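The problem with combined percentiles can be illustrated numerically. In this sketch the standardized scores and the correlation are purely hypothetical; the point is only that being at or below the 5th percentile in two correlated dimensions simultaneously is far rarer than 5%:

```python
from scipy.stats import norm, multivariate_normal

# Two standardized anthropometric variables (e.g. stature and girth)
# with an assumed correlation of 0.5.
rho = 0.5
mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

z5 = norm.ppf(0.05)           # 5th-percentile z-score of each variable
p_joint = mvn.cdf([z5, z5])   # share of the population below P5 in BOTH
print(round(p_joint, 4))      # much smaller than 0.05
```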

Relevance: 80.00%

Abstract:

Local climate is a critical element in the design of buildings. In this paper, ten years of historical weather data for all eight of Australia's capital cities are analyzed to characterize the variation profiles of climatic variables, using descriptive statistics. The pattern of the cumulative distribution and/or the profile of the percentage distribution is used to graphically illustrate the similarities and differences between the study locations. It is found that although the weather variables vary between locations, there is often a good, nearly linear relation between a weather variable and its cumulative percentage over the majority of the middle part of the distribution, the extreme parts excepted. The implications of these extreme parts, and of the slopes of the middle parts, for building design are also discussed.
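The descriptive approach above can be sketched as follows, using synthetic data rather than the actual ten-year weather records: build the empirical CDF of a variable and check the middle part for near-linearity.

```python
import numpy as np

# Hypothetical daily temperatures standing in for a weather variable
rng = np.random.default_rng(0)
temps = rng.normal(20.0, 6.0, 3650)

x = np.sort(temps)
cdf = np.arange(1, len(x) + 1) / len(x)   # empirical CDF

# Fit a straight line to the middle (10%-90%) part of the distribution
mid = (cdf >= 0.1) & (cdf <= 0.9)
slope, intercept = np.polyfit(x[mid], cdf[mid], 1)
r = np.corrcoef(x[mid], cdf[mid])[0, 1]
print(round(slope, 4), round(r, 3))       # r close to 1: nearly linear
```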

Relevance: 80.00%

Abstract:

The available wind power is stochastic, and economic and reliable power system operation requires appropriate tools in the optimal power flow (OPF) model. This paper presents an OPF formulation that accounts for the intermittency of wind power. A Weibull distribution is adopted to model the stochastic wind speed and the resulting power distribution. The reserve requirement is evaluated based on the wind distribution and the risk of under- or overestimating the wind power. In addition, the Wind Energy Conversion System (WECS) is represented by Doubly Fed Induction Generator (DFIG) based wind farms, and the reactive power capability of a DFIG-based wind farm is analyzed. The study is performed on the IEEE 30-bus system with the wind farm located at different buses and with different wind profiles. The reactive power capacity that must be installed in the wind farm to maintain a satisfactory voltage profile under the various wind-flow scenarios is also determined.
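The Weibull-based wind-power model can be sketched as below. The Weibull shape/scale and the turbine power-curve parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Sample Weibull wind speeds and map them through a piecewise turbine
# power curve to obtain a stochastic wind-power distribution.
rng = np.random.default_rng(1)
k, c = 2.0, 8.0                     # Weibull shape / scale [m/s], assumed
v = c * rng.weibull(k, 100_000)     # wind-speed samples

v_in, v_rated, v_out, p_rated = 3.0, 12.0, 25.0, 2.0   # cut-in/rated/cut-out, MW

def power(v):
    # linear ramp between cut-in and rated speed, zero elsewhere ...
    p = np.where((v >= v_in) & (v < v_rated),
                 p_rated * (v - v_in) / (v_rated - v_in), 0.0)
    # ... and rated output between rated and cut-out speed
    return np.where((v >= v_rated) & (v <= v_out), p_rated, p)

p = power(v)
print(round(p.mean(), 3))           # expected available wind power [MW]
```

The reserve requirement would then follow from the quantiles of `p`, reflecting the risk of under- or overestimation.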

Relevance: 80.00%

Abstract:

Through a combined approach involving experimental measurements and plasma modelling, it is shown that a high degree of control over the sp3/sp2 ratio of diamond-like nanocarbon films (and hence over the film properties) can be exercised, starting at the level of the electrons, through modification of the plasma electron energy distribution function. Hydrogenated amorphous carbon nanoparticle films with high percentages of diamond-like bonds are grown using a middle-frequency (2 MHz) inductively coupled Ar + CH4 plasma. The sp3 fractions measured in the thin films by X-ray photoelectron spectroscopy (XPS) and Raman spectroscopy are explained qualitatively using sp3/sp2 ratios 1) derived from the sp3- and sp2-hybridized precursor species densities calculated in a global plasma discharge model and 2) measured experimentally. It is shown that the sp3/sp2 fraction is higher at high discharge power and lower CH4 concentrations. Our results suggest that a combination of predictive modeling and experimental studies is instrumental in achieving deterministically grown, made-to-order diamond-like nanocarbons suitable for a variety of applications, spanning from nano-magnetic resonance imaging to spin-flip quantum information devices. This deterministic approach can be extended to graphene, carbon nanotips, nanodiamond and other nanocarbon materials for a variety of applications.

Relevance: 80.00%

Abstract:

Reliable calculations of the electron/ion energy losses in low-pressure, thermally nonequilibrium, low-temperature plasmas are indispensable for predictive modeling of the numerous applications of such discharges. The commonly used simplified approaches to calculating the electron/ion energy losses to the chamber walls rely on a number of simplifying assumptions that often do not account for the details of the prevailing electron energy distribution function (EEDF) and overestimate the electron losses to the walls. Through direct measurements of the EEDF and careful calculation of the contributions of the plasma electrons in low-pressure inductively coupled plasmas, it is shown that the actual kinetic energy losses of the electrons and ions depend strongly on the EEDF. Improper assumptions about the prevailing EEDF, and about the ability of the electrons to pass through the repulsive potential of the wall, are found to overestimate the total electron/ion energy losses to the walls, typically by 9 to 32%. These results are particularly important for the development of power-saving strategies for operating low-temperature, low-pressure gas discharges in applications that require reasonably low power densities. © 2008 American Institute of Physics.

Relevance: 80.00%

Abstract:

A complex low-pressure argon discharge plasma containing dust grains is studied using a Boltzmann equation for the electrons and fluid equations for the ions. Local effects, such as the spatial distribution of the dust density and external electric field, are included, and their effect on the electron energy distribution, the electron and ion number densities, the electron temperature, and the dust charge are investigated. It is found that dust particles can strongly affect the plasma parameters by modifying the electron energy distribution, the electron temperature, the creation and loss of plasma particles, as well as the spatial distributions of the electrons and ions. In particular, for sufficiently high grain density and/or size, in a low-pressure argon glow discharge, the Druyvesteyn-like electron distribution in pristine plasmas can become nearly Maxwellian. Electron collection by the dust grains is the main cause for the change in the electron distribution function.
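The Druyvesteyn-to-Maxwellian transition can be illustrated numerically. The temperature and the standard Druyvesteyn form used below are illustrative, not the measured EEDFs of the paper; the qualitative point is the depleted high-energy tail of the Druyvesteyn-like distribution:

```python
import numpy as np

eV = np.linspace(0.01, 20.0, 2000)
Te = 3.0                       # effective electron temperature [eV], assumed
dE = eV[1] - eV[0]

# Maxwellian EEDF: f(E) ~ sqrt(E) * exp(-E/Te), normalized to unit area
f_max = np.sqrt(eV) * np.exp(-eV / Te)
f_max /= f_max.sum() * dE

# Druyvesteyn EEDF: f(E) ~ sqrt(E) * exp(-0.55 * (E/Te)^2)
f_dru = np.sqrt(eV) * np.exp(-0.55 * (eV / Te) ** 2)
f_dru /= f_dru.sum() * dE

# The Druyvesteyn-like distribution carries far less probability in the
# high-energy tail, which changes ionization and wall-loss rates.
tail = eV > 10.0
print(f_max[tail].sum() * dE > f_dru[tail].sum() * dE)
```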

Relevance: 80.00%

Abstract:

We incorporated a new Riemannian fluid registration algorithm into a general MRI analysis method called tensor-based morphometry to map the heritability of brain morphology in MR images from 23 monozygotic and 23 dizygotic twin pairs. All 92 3D scans were fluidly registered to a common template. Voxelwise Jacobian determinants were computed from the deformation fields to assess local volumetric differences across subjects. Heritability maps were computed from the intraclass correlations and their significance was assessed using voxelwise permutation tests. Lobar volume heritability was also studied using the ACE genetic model. The performance of this Riemannian algorithm was compared to a more standard fluid registration algorithm: 3D maps from both registration techniques displayed similar heritability patterns throughout the brain. Power improvements were quantified by comparing the cumulative distribution functions of the p-values generated from both competing methods. The Riemannian algorithm outperformed the standard fluid registration.
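The CDF-based power comparison used above can be sketched as follows. The p-values here are synthetic (a Beta-distributed set standing in for the more powerful method, a uniform set for a null-like one), not the study's actual voxelwise maps:

```python
import numpy as np

rng = np.random.default_rng(2)
p_a = rng.beta(0.5, 1.0, 10_000)   # hypothetical "stronger" method
p_b = rng.uniform(0.0, 1.0, 10_000)  # hypothetical null-like method

def cdf_at(p, t):
    """Empirical CDF of the p-values at threshold t."""
    return np.mean(p <= t)

# A CDF lying above the other at small p indicates greater power
t = 0.05
print(cdf_at(p_a, t) > cdf_at(p_b, t))
```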

Relevance: 80.00%

Abstract:

Robust and automatic non-rigid registration depends on many parameters that have not yet been systematically explored. Here we determined how tissue classification influences non-linear fluid registration of brain MRI. Twin data are ideal for studying this question, as volumetric correlations between corresponding brain regions that are under genetic control should be higher in monozygotic twins (MZ), who share 100% of their genes, than in dizygotic twins (DZ), who share half their genes on average. When these substructure volumes are quantified using tensor-based morphometry, improved registration can be defined as the method that gives higher MZ twin correlations relative to DZs, as registration errors tend to deplete these correlations. In a study of 92 subjects, higher effect sizes were found in cumulative distribution functions derived from statistical maps when tissue classification was performed before fluid registration, versus fluidly registering the raw images. This gives empirical evidence in favor of pre-segmenting images for tensor-based morphometry.

Relevance: 80.00%

Abstract:

We used diffusion tensor magnetic resonance imaging (DTI) to reveal the extent of genetic effects on brain fiber microstructure, based on tensor-derived measures, in 22 pairs of monozygotic (MZ) twins and 23 pairs of dizygotic (DZ) twins (90 scans). After Log-Euclidean denoising to remove rank-deficient tensors, DTI volumes were fluidly registered by high-dimensional mapping of co-registered MP-RAGE scans to a geometrically-centered mean neuroanatomical template. After tensor reorientation using the strain of the 3D fluid transformation, we computed two widely used scalar measures of fiber integrity: fractional anisotropy (FA), and geodesic anisotropy (GA), which measures the geodesic distance between tensors in the symmetric positive-definite tensor manifold. Spatial maps of intraclass correlations (r) for MZ and DZ twins were compared to compute maps of Falconer's heritability statistic, i.e., the proportion of population variance explainable by genetic differences among individuals. Cumulative distribution plots (CDFs) of effect sizes showed that the manifold measure, GA, performed comparably to the Euclidean measure, FA, in detecting genetic correlations. While the maps were relatively noisy, the CDFs showed promise for detecting genetic influences on brain fiber integrity as the current sample expands.
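Falconer's heritability estimate mentioned above is a simple function of the MZ and DZ intraclass correlations; the correlation values below are illustrative, not the study's voxelwise results:

```python
# Falconer's formula: h^2 = 2 * (r_MZ - r_DZ), the proportion of
# population variance attributable to additive genetic differences.
r_mz, r_dz = 0.80, 0.45   # hypothetical intraclass correlations
h2 = 2 * (r_mz - r_dz)
print(round(h2, 2))       # 0.7
```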

Relevance: 80.00%

Abstract:

Fractional anisotropy (FA), a very widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept, as it is influenced by several quantities including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. High-angular resolution diffusion imaging (HARDI) can resolve more complex diffusion geometries than standard DTI, including fibers crossing or mixing. The tensor distribution function (TDF) can be used to reconstruct multiple underlying fibers per voxel, representing the diffusion profile as a probabilistic mixture of tensors. Here we found that DTI-derived mean diffusivity (MD) correlates well with the actual single-fiber MD, but DTI-derived FA correlates poorly with the actual single-fiber anisotropy, and may be suboptimal for detecting disease processes that affect myelination. Analysis of the TDFs revealed that almost 40% of voxels in the white matter had more than one dominant fiber present. To assess fiber integrity more accurately in these cases, we propose the differential diffusivity (DD), which measures the average anisotropy based on all dominant directions in each voxel.
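For reference, FA and MD follow from the tensor eigenvalues via their textbook definitions; the eigenvalues below are illustrative values for a single anisotropic tensor, not data from the study:

```python
import numpy as np

# Hypothetical diffusion-tensor eigenvalues [mm^2/s]
lam = np.array([1.7e-3, 0.3e-3, 0.2e-3])

md = lam.mean()                 # mean diffusivity
# FA = sqrt(3/2 * sum((lam_i - md)^2) / sum(lam_i^2))
fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
print(round(float(fa), 3))      # close to 1 for a strongly anisotropic tensor
```

A single-tensor FA like this is exactly the quantity that becomes ambiguous when several fiber populations mix within one voxel.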

Relevance: 30.00%

Abstract:

The aim of this study was to characterise and quantify the fungal fragment propagules derived and released from several fungal species (Penicillium, Aspergillus niger and Cladosporium cladosporioides) using different generation methods and different air velocities over the colonies. Real-time fungal spore fragmentation was investigated using an Ultraviolet Aerodynamic Particle Sizer (UVAPS) and a Scanning Mobility Particle Sizer (SMPS). The study showed that there were significant differences (p < 0.01) in the fragmentation percentage between different air velocities for the three generation methods, namely the direct, the fan and the fungal spore source strength tester (FSSST) methods. The percentage of fragmentation also proved to be dependent on the fungal species. The study found that there was no fragmentation for any of the fungal species at an air velocity ≤ 0.4 m/s for any method of generation. Fluorescent signals, as well as mathematical determination, also showed that the fungal fragments were derived from spores. Correlation analysis showed that the number of released fragments measured by the UVAPS under controlled conditions can be predicted on the basis of the number of spores for Penicillium and Aspergillus niger, but not for Cladosporium cladosporioides. The fluorescence percentage of the fragment samples was found to be significantly different from that of the non-fragment samples (p < 0.0001), and the fragment-sample fluorescence was always lower than that of the non-fragment samples. The size distribution and concentration of fungal fragment particles were investigated qualitatively and quantitatively by both the UVAPS and the SMPS; the UVAPS was found to be more sensitive than the SMPS for measuring small sample concentrations, and the results obtained from the two instruments were not identical for the same samples.

Relevance: 30.00%

Abstract:

This paper reports the initial steps of research on the planning of rural MV and LV networks. Two different cases are studied. In the first case, 100 loads are distributed uniformly along a 100 km line in a distribution network; in the second case, the load structure is closer to the rural situation. In case 2, 21 loads are located in a distribution system with increasing spacing: the distance between loads 1 and 2 is 3 km, between loads 2 and 3 it is 6 km, and so on. These two models represent, to some extent, the distribution systems in urban and rural areas, respectively. The objective function for the design of the optimal system consists of three main parts: the cost of the transformers, and of the MV and LV conductors. The bus voltage is expressed as a constraint and should be maintained within a standard level, rising or falling by no more than 5%.
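The case 2 layout described above can be written out directly; this small sketch only reproduces the stated spacing rule to show the resulting feeder geometry:

```python
# 21 loads separated by spans growing in 3 km steps: 3, 6, 9, ... km
spans = [3 * i for i in range(1, 21)]          # 20 spans between 21 loads
positions = [sum(spans[:i]) for i in range(21)]  # km from the first load
print(positions[:4], positions[-1])            # [0, 3, 9, 18] 630
```

The total feeder length of 630 km (versus 100 km in case 1) is what makes the voltage-drop constraint much harder to satisfy in the rural case.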

Relevance: 30.00%

Abstract:

In this paper, the placement of sectionalizers, as well as a cross-connection, is optimally determined so that an objective function is minimized. The objective function consists of two main parts: the switch cost and the reliability cost. The switch cost comprises the cost of the sectionalizers and the cross-connection, and the reliability cost is assumed to be proportional to a reliability index, SAIDI. To model the allocation of the sectionalizers and cross-connection realistically, the cost related to each element is treated as discrete. Because the availability of sectionalizers is described by binary variables, the problem is highly discrete; the risk of converging to a local minimum is therefore high, and a heuristic optimization method is needed. A Discrete Particle Swarm Optimization (DPSO) algorithm is employed to deal with this discrete problem. Finally, a test distribution system is used to validate the proposed method.
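The two-part objective described above can be sketched as a simple cost function; all cost coefficients below are illustrative placeholders, not the paper's data:

```python
# Total cost = switch investment + reliability cost proportional to SAIDI
def objective(n_sectionalizers, has_cross_connection, saidi_hours,
              c_sec=5_000.0, c_cross=20_000.0, c_rel=10_000.0):
    switch_cost = n_sectionalizers * c_sec + has_cross_connection * c_cross
    reliability_cost = c_rel * saidi_hours   # assumed proportional to SAIDI
    return switch_cost + reliability_cost

# e.g. 3 sectionalizers, one cross-connection, SAIDI of 2.5 h/customer/yr
print(objective(3, True, 2.5))   # 60000.0
```

A DPSO would then search over the binary placement variables, with each candidate placement mapped to a SAIDI value by a reliability evaluation before this objective is applied.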