Abstract:
We examined the anatomy of expanding, mature, and senescing leaves of tropical plants for the presence of the red pigments anthocyanins and betacyanins. We studied a total of 463 species in 370 genera, belonging to 94 families. This included 21 species from five families in the Caryophyllales, where betacyanins are the basis for red color. We also included 14 species of ferns and gymnosperms in seven families and 29 species with undersurface coloration at maturity. We analyzed 399 angiosperm species (74 families) for factors (especially developmental and evolutionary) influencing anthocyanin production during expansion and senescence. During expansion, 44.9% of the species produced anthocyanins, but only 13.5% did so during senescence. At both stages, relatively few patterns of tissue distribution developed, primarily in the mesophyll, and very few taxa produced anthocyanins in dermal and ground tissue simultaneously. Of the 35 species producing anthocyanins both in development and senescence, most had similar cellular distributions. Anthocyanin distributions were identical in different developing leaves of three heteroblastic taxa. Phylogeny has influenced the distribution of anthocyanins in the epidermis and mesophyll of expanding leaves and in the palisade parenchyma during senescence, although these influences are not strong. Betacyanins appear to have similar distributions in leaves of taxa within the Caryophyllales and, perhaps, similar functions. The presence of anthocyanins in the mesophyll of so many species is inconsistent with the hypothesis of protection against UV damage or fungal pathogens, and the differing tissue distributions indicate that the pigments may function in different ways, such as photoprotection and free-radical scavenging.
Abstract:
Tropical cyclones are a continuing threat to life and property. Willoughby (2012) found that a Pareto (power-law) cumulative distribution fitted the impacts of the most damaging 10% of US hurricane seasons well. Here, we find that damage follows a Pareto distribution because the assets at hazard follow a Zipf distribution, which can be thought of as a Pareto distribution with exponent 1. The Z-CAT model is an idealized hurricane catastrophe model that represents a coastline where populated places with Zipf-distributed assets are randomly scattered and damaged by virtual hurricanes with sizes and intensities generated through a Monte Carlo process. The results produce realistic Pareto exponents. The ability of the Z-CAT model to simulate different climate scenarios allowed testing of sensitivities to maximum potential intensity, landfall rates, and building structure vulnerability. The Z-CAT results demonstrate that a statistically significant difference in damage appears only when the parameter changes produce a doubling of damage.
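A minimal sketch of the mechanism described above, under illustrative assumptions (the place count, storm widths, and damage fractions below are arbitrary stand-ins, not the actual Z-CAT parameters): Zipf-distributed assets are scattered along a unit coastline, random storms damage nearby places, and the Pareto tail exponent of seasonal damage is estimated with the Hill estimator.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumption: 500 coastal places whose assets follow a Zipf law
# (asset of rank k proportional to 1/k, i.e. a Pareto law with exponent 1).
n_places = 500
assets = 1.0 / np.arange(1, n_places + 1)
coords = rng.uniform(0.0, 1.0, n_places)       # positions along a unit coastline

def season_damage(n_storms=3, storm_width=0.02):
    """Total damage of one simulated hurricane season (toy model)."""
    total = 0.0
    for _ in range(n_storms):
        center = rng.uniform()                  # random landfall point
        frac = rng.uniform(0.1, 1.0)            # damage fraction, an intensity proxy
        hit = np.abs(coords - center) < storm_width
        total += frac * assets[hit].sum()
    return total

damages = np.array([season_damage() for _ in range(10_000)])

# Hill estimator of the Pareto tail exponent over the worst 10% of seasons.
tail = np.sort(damages)[-1_000:]
alpha = 1.0 / np.mean(np.log(tail / tail.min()))
print(f"estimated Pareto tail exponent: {alpha:.2f}")
```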
Abstract:
This work presents a computational code, called MOMENTS, developed for use in process control to determine a characteristic transfer function of industrial units when radiotracer techniques are applied to study a unit's performance. The methodology is based on measuring the residence time distribution (RTD) function and calculating the first and second temporal moments of the tracer data recorded by two NaI scintillation detectors positioned to register the complete tracer movement inside the unit. Nonlinear regression is used to fit various mathematical models, and a statistical test selects the one that best represents the transfer function. Using MOMENTS, twelve different models can be fitted to a measured curve to calculate the technical parameters of the unit.
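For reference, the temporal moments of an RTD curve are standard quantities; the sketch below computes them on placeholder detector curves (the Gaussian signals are illustrative stand-ins for real NaI records, and this is not the MOMENTS code itself).

```python
import numpy as np

def rtd_moments(t, c):
    """Mean residence time and variance from a tracer response curve c(t)."""
    e = c / np.trapz(c, t)                        # normalize to the RTD function E(t)
    t_mean = np.trapz(t * e, t)                   # first temporal moment
    var = np.trapz((t - t_mean) ** 2 * e, t)      # second central moment
    return t_mean, var

# Placeholder data standing in for the two NaI detector records.
t = np.linspace(0, 60, 600)                       # time, s
c_in = np.exp(-(t - 5) ** 2 / 2)                  # inlet detector curve
c_out = np.exp(-(t - 20) ** 2 / 8)                # outlet detector curve

# Moment differences characterize the unit between the two detectors.
m_in, v_in = rtd_moments(t, c_in)
m_out, v_out = rtd_moments(t, c_out)
print(f"mean residence time: {m_out - m_in:.1f} s, variance: {v_out - v_in:.1f} s^2")
```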
Abstract:
Parallel combinatory orthogonal frequency division multiplexing (PC-OFDM) yields a lower maximum peak-to-average power ratio (PAR), higher bandwidth efficiency, and a lower bit error rate (BER) on Gaussian channels compared to OFDM systems. However, PC-OFDM does not improve the statistics of the PAR significantly. In this chapter, the use of a set of fixed permutations to improve the PAR statistics of a PC-OFDM signal is presented. In this technique, interleavers are used to produce K-1 permuted sequences from the same information sequence, and the sequence with the lowest PAR among the K candidates is chosen for transmission. The PAR of a PC-OFDM signal can be further reduced by 3-4 dB in this way. Mathematical expressions for the complementary cumulative distribution function (CCDF) of the PAR of PC-OFDM and interleaved PC-OFDM signals are also presented.
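A minimal sketch of the interleaver-selection step on a generic OFDM block (QPSK subcarriers and the parameter values are illustrative assumptions; the PC-OFDM mapping itself is omitted): K-1 fixed permutations are tried, and the candidate with the lowest PAR is transmitted.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 256, 8                                    # subcarriers, candidate sequences

def par_db(symbols):
    """Peak-to-average power ratio of the time-domain OFDM signal, in dB."""
    x = np.fft.ifft(symbols)
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# One block of QPSK data (a stand-in for the PC-OFDM mapped sequence).
data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N)

# K-1 fixed permutations known to both transmitter and receiver.
perms = [rng.permutation(N) for _ in range(K - 1)]
candidates = [data] + [data[p] for p in perms]

# Transmit the candidate with the lowest PAR (its index is sent as side info).
pars = [par_db(c) for c in candidates]
best = int(np.argmin(pars))
print(f"original PAR {pars[0]:.2f} dB -> selected PAR {pars[best]:.2f} dB")
```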
Abstract:
The success rate of carrier phase ambiguity resolution (AR) is the probability that the ambiguities are successfully fixed to their correct integer values. In existing work, an exact success rate formula for the integer bootstrapping estimator has been used as a sharp lower bound for the integer least squares (ILS) success rate. Rigorous computation of the success rate for the more general ILS solutions has been considered difficult because of the complexity of the ILS ambiguity pull-in region and the computational load of integrating the multivariate probability density function. The contributions of this work are twofold. First, the pull-in region, mathematically expressed as the vertices of a polyhedron, is represented by a multi-dimensional grid, at which the cumulative probability can be integrated with the multivariate normal cumulative distribution function (mvncdf) available in Matlab. The bivariate case is studied, where the pull-in region is usually defined as a hexagon and the probability is easily obtained by evaluating mvncdf at all grid points within the convex polygon. Second, the paper compares the computed integer rounding and integer bootstrapping success rates, and the lower and upper bounds of the ILS success rate, to the actual ILS AR success rates obtained from a 24 h GPS data set for a 21 km baseline. The results demonstrate that the upper bound of the ILS AR probability given in the existing literature agrees well with the actual ILS success rate, while the success rate computed with the integer bootstrapping method is already a sharp approximation to it. The results also show that epoch-to-epoch variations or uncertainty in the unit-weight variance estimates significantly affect the success rates computed by the different methods, and therefore deserve more attention if useful success probability predictions are to be obtained.
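For reference, the exact integer bootstrapping success rate mentioned above has the well-known closed form P = prod_i (2*Phi(1/(2*sigma_i)) - 1), with sigma_i the conditional standard deviations of the decorrelated ambiguities; a minimal sketch (the sigma values below are illustrative):

```python
import numpy as np
from scipy.stats import norm

def bootstrap_success_rate(cond_std):
    """Exact integer bootstrapping success rate,
    P = prod_i (2*Phi(1/(2*sigma_i)) - 1),
    where sigma_i are the conditional standard deviations of the
    (decorrelated) ambiguities and Phi is the standard normal CDF."""
    s = np.asarray(cond_std, dtype=float)
    return np.prod(2 * norm.cdf(1.0 / (2.0 * s)) - 1)

# Illustrative conditional standard deviations, in cycles.
print(bootstrap_success_rate([0.10, 0.12, 0.15]))   # close to 1 for small sigmas
```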
Abstract:
Since the availability of 3D full body scanners and the associated software systems for operating on large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by the proponents of the 3D paradigm need to be critically reviewed, though. 3D anthropometry has advantages as well as shortfalls, which need to be carefully considered. While it is apparent that measuring a full body point cloud allows for easier storage of raw data and improves quality control, the difficulty of calculating standardized measurements from the point cloud is widely underestimated. Early studies that used 3D point clouds to derive anthropometric dimensions showed unacceptable deviations from the standardized results measured manually. While 3D human point clouds provide a valuable tool for replicating specific individuals in further virtual studies, or for personalizing garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections; a 1D/2D approach is therefore sufficient to solve an ergonomic design problem. As a consequence, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach and designed for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution functions of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is to be seen as a two-dimensional integral or distribution function of length and girth, the calculation of combined percentiles, a common ergonomic requirement, is undefined, as illustrated in the sketch below. Consequently, we suggest critically reviewing the cost and use of 3D anthropometry, and we recommend making proper use of the widely available one- and two-dimensional anthropometric data in ergonomic design.
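The combined-percentile problem can be illustrated numerically (the bivariate-normal model and the correlation of 0.5 below are illustrative assumptions, not survey data): the share of people above the 95th percentile in two correlated dimensions simultaneously is far below 5%, so a single "95th-percentile manikin" does not correspond to 5% of users.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumption: standardized stature and chest girth as
# correlated normal variables with r = 0.5.
n = 1_000_000
cov = [[1.0, 0.5], [0.5, 1.0]]
stature, girth = rng.multivariate_normal([0.0, 0.0], cov, n).T

p95_s = np.quantile(stature, 0.95)
p95_g = np.quantile(girth, 0.95)
both = np.mean((stature > p95_s) & (girth > p95_g))
print(f"share above the 95th percentile in both variables: {both:.3%}")  # well below 5%
```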
Abstract:
Local climate is a critical element in the design of buildings. In this paper, ten years of historical weather data for all eight of Australia's capital cities are analyzed to characterize the variation profiles of climatic variables, using descriptive statistics. Cumulative distribution patterns and/or percentage distribution profiles are used to graphically illustrate the similarities and differences between the study locations. It is found that, although the weather variables vary with location, there is often a good, nearly linear relation between a weather variable and its cumulative percentage over the middle part of the distribution, with the exception of the extreme parts. The implications of these extreme parts and of the slopes of the middle parts for building design are also discussed.
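A minimal sketch of this kind of analysis on placeholder data (the synthetic temperature series stands in for the historical records): compute the empirical cumulative distribution and fit a line to its middle part.

```python
import numpy as np

rng = np.random.default_rng(7)
temps = rng.normal(18, 6, 3650)          # placeholder: ten years of daily temperatures

# Empirical cumulative distribution: sorted values vs. cumulative percentage.
x = np.sort(temps)
cum_pct = np.arange(1, len(x) + 1) / len(x) * 100

# Fit a line to the middle part (10th to 90th percentile), excluding the extremes.
mid = (cum_pct >= 10) & (cum_pct <= 90)
slope, intercept = np.polyfit(x[mid], cum_pct[mid], 1)
print(f"middle-part slope: {slope:.2f} % per degree")
```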
Abstract:
The available wind power is stochastic, and economic, reliable power system operation requires appropriate tools in the OPF model. This paper presents an OPF formulation that accounts for the factors involved in the intermittency of wind power. A Weibull distribution is adopted to model the stochastic wind speed and the resulting power distribution. The reserve requirement is evaluated based on the wind distribution and the risk of under- or overestimating the wind power. In addition, the Wind Energy Conversion System (WECS) is represented by Doubly Fed Induction Generator (DFIG) based wind farms, and the reactive power capability of a DFIG-based wind farm is analyzed. The study is performed on the IEEE 30-bus system with the wind farm located at different buses and with different wind profiles. The reactive power capacity that must be installed in the wind farm to maintain a satisfactory voltage profile under the various wind flow scenarios is also determined.
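A minimal sketch of a Weibull wind model feeding a generic turbine power curve (the Weibull parameters and turbine ratings below are illustrative assumptions, not the paper's values):

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative Weibull wind-speed model: shape k = 2, scale c = 8 m/s.
k, c = 2.0, 8.0
v = c * rng.weibull(k, 100_000)

def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
    """Piecewise turbine power curve (MW): cubic between cut-in and rated speed,
    constant up to cut-out, zero elsewhere."""
    p = np.zeros_like(v)
    ramp = (v >= v_in) & (v < v_rated)
    p[ramp] = p_rated * (v[ramp] ** 3 - v_in ** 3) / (v_rated ** 3 - v_in ** 3)
    p[(v >= v_rated) & (v <= v_out)] = p_rated
    return p

p = turbine_power(v)
print(f"expected output: {p.mean():.2f} MW, P(zero output): {np.mean(p == 0):.1%}")
```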
Abstract:
Through a combined approach involving experimental measurement and plasma modelling, it is shown that a high degree of control over the sp3/sp2 ratio of diamond-like nanocarbon films (and hence over film properties) may be exercised, starting at the level of electrons (through modification of the plasma electron energy distribution function). Hydrogenated amorphous carbon nanoparticle films with high percentages of diamond-like bonds are grown using a middle-frequency (2 MHz) inductively coupled Ar + CH4 plasma. The sp3 fractions measured in the thin films by X-ray photoelectron spectroscopy (XPS) and Raman spectroscopy are explained qualitatively using sp3/sp2 ratios (1) derived from the sp3- and sp2-hybridized precursor species densities calculated in a global plasma discharge model and (2) measured experimentally. It is shown that the sp3/sp2 fraction is higher at higher discharge powers and lower CH4 concentrations. Our results suggest that a combination of predictive modeling and experimental studies is instrumental in achieving deterministically grown, made-to-order diamond-like nanocarbons suitable for a variety of applications, from nano-magnetic resonance imaging to spin-flip quantum information devices. This deterministic approach can be extended to graphene, carbon nanotips, nanodiamond, and other nanocarbon materials for a variety of applications.
Abstract:
Reliable calculations of the electron/ion energy losses in low-pressure, thermally nonequilibrium, low-temperature plasmas are indispensable for predictive modeling of the numerous applications of such discharges. The commonly used simplified approaches to calculating electron/ion energy losses to the chamber walls rely on a number of simplifying assumptions that often do not account for the details of the prevailing electron energy distribution function (EEDF) and overestimate the contributions of the electron losses to the walls. Direct measurements of the EEDF and careful calculation of the contributions of the plasma electrons in low-pressure inductively coupled plasmas show that the actual kinetic energy losses of the electrons and ions strongly depend on the EEDF. It is revealed that improper assumptions about the prevailing EEDF, and about the ability of the electrons to pass through the repulsive potential of the wall, lead to overestimates of the total electron/ion energy losses to the walls that are typically in the range of 9-32%. These results are particularly important for developing power-saving strategies for operating low-temperature, low-pressure gas discharges in the diverse applications that require reasonably low power densities. © 2008 American Institute of Physics.
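The sensitivity of such loss estimates to the assumed EEDF can be illustrated with the generalized form f(eps) ~ sqrt(eps) * exp(-(eps/e_x)^x), where x = 1 gives a Maxwellian and x = 2 a Druyvesteyn distribution; in the sketch below, the 3 eV mean energy and the 15 V wall potential are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq

def eedf(eps, mean_energy, x):
    """Generalized EEDF f(eps) ~ sqrt(eps) * exp(-(eps/e_x)**x);
    x = 1 is Maxwellian, x = 2 is Druyvesteyn. The scale e_x is tuned
    numerically so both distributions share the same mean energy."""
    def mean_for(e_x):
        f = np.sqrt(eps) * np.exp(-(eps / e_x) ** x)
        f /= np.trapz(f, eps)
        return np.trapz(eps * f, eps)
    e_x = brentq(lambda s: mean_for(s) - mean_energy, 0.01, 100.0)
    f = np.sqrt(eps) * np.exp(-(eps / e_x) ** x)
    return f / np.trapz(f, eps)

eps = np.linspace(1e-3, 60.0, 6000)            # electron energy grid, eV
for x, name in [(1, "Maxwellian"), (2, "Druyvesteyn")]:
    f = eedf(eps, mean_energy=3.0, x=x)
    # Fraction of electrons able to overcome an assumed 15 V wall potential:
    # the Druyvesteyn tail is much thinner, so its wall losses are far smaller.
    frac = np.trapz(f[eps > 15.0], eps[eps > 15.0])
    print(f"{name}: fraction above the wall barrier = {frac:.1e}")
```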
Abstract:
A complex low-pressure argon discharge plasma containing dust grains is studied using a Boltzmann equation for the electrons and fluid equations for the ions. Local effects, such as the spatial distribution of the dust density and external electric field, are included, and their effect on the electron energy distribution, the electron and ion number densities, the electron temperature, and the dust charge are investigated. It is found that dust particles can strongly affect the plasma parameters by modifying the electron energy distribution, the electron temperature, the creation and loss of plasma particles, as well as the spatial distributions of the electrons and ions. In particular, for sufficiently high grain density and/or size, in a low-pressure argon glow discharge, the Druyvesteyn-like electron distribution in pristine plasmas can become nearly Maxwellian. Electron collection by the dust grains is the main cause for the change in the electron distribution function.
Abstract:
We incorporated a new Riemannian fluid registration algorithm into tensor-based morphometry, a general MRI analysis method, to map the heritability of brain morphology in MR images from 23 monozygotic and 23 dizygotic twin pairs. All 92 3D scans were fluidly registered to a common template. Voxelwise Jacobian determinants were computed from the deformation fields to assess local volumetric differences across subjects. Heritability maps were computed from the intraclass correlations, and their significance was assessed using voxelwise permutation tests. Lobar volume heritability was also studied using the ACE genetic model. The performance of the Riemannian algorithm was compared to that of a more standard fluid registration algorithm: 3D maps from both registration techniques displayed similar heritability patterns throughout the brain. Power improvements were quantified by comparing the cumulative distribution functions of the p-values generated by the two competing methods; the Riemannian algorithm outperformed the standard fluid registration.
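Comparing cumulative distribution functions of voxelwise p-values is a standard way to rank detection power (a curve lying farther above the null line y = x at small thresholds indicates more power); a minimal sketch on synthetic p-values (the beta-distributed alternatives are illustrative stand-ins for real statistical maps):

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic voxelwise p-values for two competing registration methods:
# smaller beta shape parameter => p-values pile up near 0 => more power.
p_method_a = rng.beta(0.6, 1.0, 50_000)
p_method_b = rng.beta(0.8, 1.0, 50_000)

def cdf_at(p_values, thresholds):
    """Empirical CDF of p-values evaluated at the given thresholds."""
    return np.array([np.mean(p_values <= t) for t in thresholds])

t = np.linspace(0.001, 0.05, 5)
print("threshold  method_a  method_b")
for ti, a, b in zip(t, cdf_at(p_method_a, t), cdf_at(p_method_b, t)):
    print(f"{ti:9.3f}  {a:8.3f}  {b:8.3f}")
```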
Abstract:
Robust and automatic non-rigid registration depends on many parameters that have not yet been systematically explored. Here we determined how tissue classification influences non-linear fluid registration of brain MRI. Twin data are ideal for studying this question, as volumetric correlations between corresponding brain regions under genetic control should be higher in monozygotic twins (MZ), who share 100% of their genes, than in dizygotic twins (DZ), who share half their genes on average. When these substructure volumes are quantified using tensor-based morphometry, improved registration can be defined as the method that gives higher MZ relative to DZ twin correlations, as registration errors tend to deplete these correlations. In a study of 92 subjects, higher effect sizes were found in cumulative distribution functions derived from statistical maps when tissue classification was performed before fluid registration than when the raw images were fluidly registered. This provides empirical evidence in favor of pre-segmenting images for tensor-based morphometry.
Abstract:
We used diffusion tensor magnetic resonance imaging (DTI) to reveal the extent of genetic effects on brain fiber microstructure, based on tensor-derived measures, in 22 pairs of monozygotic (MZ) twins and 23 pairs of dizygotic (DZ) twins (90 scans). After Log-Euclidean denoising to remove rank-deficient tensors, the DTI volumes were fluidly registered by high-dimensional mapping of co-registered MP-RAGE scans to a geometrically-centered mean neuroanatomical template. After tensor reorientation using the strain of the 3D fluid transformation, we computed two widely used scalar measures of fiber integrity: fractional anisotropy (FA) and geodesic anisotropy (GA), which measures the geodesic distance between tensors in the symmetric positive-definite tensor manifold. Spatial maps of intraclass correlations (r) in MZ and DZ twins were compared to compute maps of Falconer's heritability statistics, i.e., the proportion of population variance explainable by genetic differences among individuals. Cumulative distribution plots (CDFs) of effect sizes showed that the manifold measure, GA, performed comparably to the Euclidean measure, FA, in detecting genetic correlations. While the maps were relatively noisy, the CDFs showed promise for detecting genetic influences on brain fiber integrity as the current sample expands.
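For reference, Falconer's estimate compares the MZ and DZ intraclass correlations as h^2 = 2*(r_MZ - r_DZ); a minimal sketch on placeholder correlation values:

```python
import numpy as np

def falconer_h2(r_mz, r_dz):
    """Falconer's heritability estimate h^2 = 2*(r_MZ - r_DZ), clipped to [0, 1]."""
    return np.clip(2.0 * (np.asarray(r_mz) - np.asarray(r_dz)), 0.0, 1.0)

# Placeholder voxelwise intraclass correlation values.
r_mz = np.array([0.80, 0.65, 0.40])
r_dz = np.array([0.45, 0.40, 0.35])
print(falconer_h2(r_mz, r_dz))   # [0.7, 0.5, 0.1]
```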
Abstract:
Fractional anisotropy (FA), a very widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept, as it is influenced by several quantities including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. High-angular resolution diffusion imaging (HARDI) can resolve more complex diffusion geometries than standard DTI, including crossing or mixing fibers. The tensor distribution function (TDF) can be used to reconstruct multiple underlying fibers per voxel, representing the diffusion profile as a probabilistic mixture of tensors. Here we found that DTI-derived mean diffusivity (MD) correlates well with actual individual fiber MD, but DTI-derived FA correlates poorly with actual individual fiber anisotropy and may be suboptimal for detecting disease processes that affect myelination. Analysis of the TDFs revealed that almost 40% of white matter voxels contained more than one dominant fiber. To assess fiber integrity more accurately in these cases, we propose the differential diffusivity (DD), which measures the average anisotropy based on all dominant directions in each voxel.
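For reference, the standard DTI scalars can be computed from the tensor eigenvalues (these are the textbook definitions, not the TDF machinery of this work): MD is the eigenvalue mean and FA the normalized eigenvalue dispersion.

```python
import numpy as np

def md_fa(evals):
    """Mean diffusivity and fractional anisotropy from tensor eigenvalues:
    MD = mean(lambda), FA = sqrt(3/2 * sum((lambda - MD)^2) / sum(lambda^2))."""
    l = np.asarray(evals, dtype=float)
    md = l.mean()
    fa = np.sqrt(1.5 * np.sum((l - md) ** 2) / np.sum(l ** 2))
    return md, fa

# Example: a prolate (fiber-like) tensor, eigenvalues in mm^2/s.
print(md_fa([1.7e-3, 0.3e-3, 0.3e-3]))   # high FA, typical of coherent white matter
```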