965 results for Cumulative Distribution Function


Relevance: 30.00%

Abstract:

This paper investigates a simple procedure for robustly estimating the mean of an asymmetric distribution. The procedure removes observations that are larger or smaller than certain limits and takes the arithmetic mean of the remaining observations, the limits being determined with the help of a parametric model, e.g., the Gamma, Weibull, or Lognormal distribution. The breakdown point, the influence function, the (asymptotic) variance, and the contamination bias of this estimator are explored and compared numerically with those of competing estimators.
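A minimal sketch of such a trimmed-mean procedure, assuming (purely for illustration) a Lognormal working model fitted by moments of the log-data; the ~1%/99% trimming quantiles and all numbers are illustrative choices, not taken from the paper:

```python
import numpy as np

def parametric_trimmed_mean(x, z=2.326):
    """Trimmed mean with limits from a fitted Lognormal model.

    The Lognormal parameters are estimated from the mean and standard
    deviation of log(x); observations outside the model's ~1%/99%
    quantiles (z = 2.326 is the standard normal quantile) are discarded.
    """
    x = np.asarray(x, dtype=float)
    logs = np.log(x)
    mu, sigma = logs.mean(), logs.std(ddof=1)
    lo, hi = np.exp(mu - z * sigma), np.exp(mu + z * sigma)
    kept = x[(x >= lo) & (x <= hi)]
    return kept.mean()

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=1.0, sigma=0.5, size=1000)
sample = np.concatenate([sample, [1e4]])   # one gross outlier
print(parametric_trimmed_mean(sample))     # close to the bulk mean (~3),
                                           # far below the contaminated mean
```

Under asymmetric contamination, the parametric limits remove the outlier while keeping almost all of the bulk, which is the intuition behind the estimator studied here.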

Relevance: 30.00%

Abstract:

This paper proposes a novel approach for the analysis of illicit tablets based on their visual characteristics. In particular, it concentrates on the problem of ecstasy pill seizure profiling and monitoring. The presented method extracts visual information from pill images and builds a representation of it, i.e., a pill profile based on the pill's visual appearance. Different visual features are used to build different image similarity measures, which are the basis for a pill monitoring strategy built on both discriminative and clustering models. The discriminative model makes it possible to infer whether two pills come from the same seizure, while the clustering model groups pills that share similar visual characteristics. The resulting clustering structure allows a visual identification of the relationships between different seizures. The proposed approach was evaluated using a data set of 621 ecstasy pill pictures. The results demonstrate that this is a feasible and cost-effective method for pill profiling and monitoring.
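A toy illustration of the two-model idea, assuming pills are represented by color histograms and compared with cosine similarity; the profiles, threshold, and feature choice are all illustrative assumptions, not the paper's actual features:

```python
import numpy as np
from itertools import combinations

def cosine_sim(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative "pill profiles": normalized color histograms (4 bins each).
pills = {
    "A1": np.array([0.70, 0.20, 0.05, 0.05]),
    "A2": np.array([0.68, 0.22, 0.05, 0.05]),  # visually close to A1
    "B1": np.array([0.10, 0.10, 0.60, 0.20]),
}

# Discriminative step: decide whether two pills look alike.
THRESHOLD = 0.99  # illustrative cutoff
links = {(p, q) for p, q in combinations(pills, 2)
         if cosine_sim(pills[p], pills[q]) >= THRESHOLD}

# Clustering step: connected components of the "looks alike" graph.
clusters = {p: {p} for p in pills}
for p, q in links:
    merged = clusters[p] | clusters[q]
    for r in merged:
        clusters[r] = merged

print(sorted(map(sorted, {frozenset(c) for c in clusters.values()})))
# → [['A1', 'A2'], ['B1']]
```

The pairwise decision plays the role of the discriminative model, and the resulting components correspond to clusters of visually related seizures.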

Relevance: 30.00%

Abstract:

In a distributed key distribution scheme, a set of servers helps a set of users in a group to securely obtain a common key. Security means that an adversary who corrupts some servers and some users has no information about the key of a non-corrupted group. In this work, we formalize the security analysis of one such scheme, which was not considered in the original proposal. We prove the scheme is secure in the random oracle model, assuming that the Decisional Diffie-Hellman (DDH) problem is hard to solve. We also detail possible modifications of that scheme and of the original one which allow us to prove the security of the schemes without assuming that a specific hash function behaves as a random oracle. As usual, this improvement in the security of the schemes comes at the cost of an efficiency loss.
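For background, the DDH assumption invoked above concerns Diffie-Hellman tuples; a toy key agreement with deliberately tiny parameters (far too small to be secure, chosen only to make the algebra concrete — not the paper's actual scheme):

```python
import secrets

# Toy parameters (insecure; illustration only).
p = 2**31 - 1   # a Mersenne prime
g = 7           # fixed public base

# Each party picks a secret exponent and publishes g^x mod p.
x = secrets.randbelow(p - 2) + 1
y = secrets.randbelow(p - 2) + 1
gx, gy = pow(g, x, p), pow(g, y, p)

# Both sides derive the same value g^(x*y) mod p. Informally, DDH says
# that given (g, gx, gy) this value is indistinguishable from a random
# group element, which is what the secrecy of the derived key rests on.
k_user = pow(gy, x, p)
k_server = pow(gx, y, p)
print(k_user == k_server)  # → True
```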

Relevance: 30.00%

Abstract:

In Quantitative Microbial Risk Assessment, it is vital to understand how lag times of individual cells are distributed over a bacterial population. Such identified distributions can be used to predict the time by which, in a growth-supporting environment, a few pathogenic cells can multiply to a poisoning concentration level. We model the lag time of a single cell, inoculated into a new environment, by the delay of the growth function characterizing the generated subpopulation. We introduce an easy-to-implement procedure, based on the method of moments, to estimate the parameters of the distribution of single cell lag times. The advantage of the method is especially apparent for cases where the initial number of cells is small and random, and the culture is detectable only in the exponential growth phase.
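A sketch of a method-of-moments fit, assuming (purely for illustration) that single-cell lag times follow a Gamma distribution whose shape and scale are matched to the sample mean and variance; the paper's actual moment equations, which account for the small random inoculum, are more involved:

```python
import numpy as np

def gamma_moments_fit(lags):
    """Method of moments for a Gamma(shape k, scale theta) lag-time model.

    Since mean = k * theta and variance = k * theta**2, we get
    theta = var / mean and k = mean**2 / var.
    """
    lags = np.asarray(lags, dtype=float)
    m, v = lags.mean(), lags.var(ddof=1)
    return m * m / v, v / m   # (k, theta)

rng = np.random.default_rng(1)
sample = rng.gamma(shape=4.0, scale=0.5, size=5000)  # synthetic lag times (h)
k_hat, theta_hat = gamma_moments_fit(sample)
print(k_hat, theta_hat)  # close to 4.0 and 0.5
```

Matching low-order moments in this way avoids likelihood maximization, which is the practical appeal when only coarse growth-curve data are available.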

Relevance: 30.00%

Abstract:

This paper shows that the distribution of observed consumption is not a good proxy for the distribution of heterogeneous consumers when the current tariff is an increasing block tariff. We use a two-step method to recover the "true" distribution of consumers. First, we estimate the demand function induced by the current tariff. Second, using the demand system, we specify the distribution of consumers as a function of observed consumption to recover the true distribution. Finally, we design a new two-part tariff, which allows us to evaluate the equity justification for the increasing block tariff.
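A toy illustration of why observed consumption misrepresents the underlying heterogeneity under an increasing block tariff: assume a linear demand q = θ − b·p evaluated at the block's marginal price, so recovering a consumer's type θ requires adding back b·p(q). The demand form, block prices, and parameters below are illustrative assumptions, not the paper's estimates:

```python
# Illustrative two-block increasing tariff: marginal price jumps at q = 10.
P_LOW, P_HIGH, KINK = 1.0, 2.0, 10.0
B = 2.0  # assumed price sensitivity of the linear demand q = theta - B * p

def marginal_price(q):
    return P_LOW if q <= KINK else P_HIGH

def recover_type(q_observed):
    """Invert the demand function at the marginal price the consumer faced
    (the second step of the toy exercise)."""
    return q_observed + B * marginal_price(q_observed)

# Two consumers on different blocks:
q_small, q_large = 8.0, 12.0
theta_small, theta_large = recover_type(q_small), recover_type(q_large)
print(theta_small, theta_large)  # → 10.0 16.0
```

The observed consumption gap is 4 while the recovered type gap is 6: because the large consumer faces a higher marginal price, observed consumption compresses the true heterogeneity.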

Relevance: 30.00%

Abstract:

We investigated the distribution, morphology, and abundance of antennal sensilla of Coboldia fuscipes (Meigen) using scanning electron microscopy. Antennae of C. fuscipes consisted of scape, pedicel, and flagellum with eight flagellomeres. The antennal scape and pedicel bore only one type of sensillum, i.e., sensilla chaetica. Significant differences were found between the number and distribution of these sensilla. Four types of morphologically distinct sensilla were identified on the flagellum: sensilla chaetica, sensilla trichoidea, sensilla coeloconica, and sensilla basiconica (three subtypes). Significant differences were found in the abundance and distribution of sensilla among the antennal flagella and the various flagellomeres in both sexes. Sensilla trichoidea were the most abundant sensilla found on the antennal flagellum. Sensilla chaetica were the largest and longest of all the sensilla types found on the antennal surface of C. fuscipes. Sensilla coeloconica were widely distributed over the flagellum surface, except for the first flagellomere of females. Some significant differences in abundance and distribution were also observed among the sensilla basiconica of the flagellum. The probable biological function of each sensillum type was deduced on the basis of its structure. These results serve as an important basis for further studies on the host-location mechanism and mating behavior of C. fuscipes.

Relevance: 30.00%

Abstract:

During the last 2 years, several novel genes that encode glucose transporter-like proteins have been identified and characterized. Because of their sequence similarity with GLUT1, these genes appear to belong to the family of solute carriers 2A (SLC2A, protein symbol GLUT). Sequence comparisons of all 13 family members allow the definition of characteristic sugar/polyol transporter signatures: (1) the presence of 12 membrane-spanning helices, (2) seven conserved glycine residues in the helices, (3) several basic and acidic residues at the intracellular surface of the proteins, (4) two conserved tryptophan residues, and (5) two conserved tyrosine residues. On the basis of sequence similarities and characteristic elements, the extended GLUT family can be divided into three subfamilies, namely class I (the previously known glucose transporters GLUT1-4), class II (the previously known fructose transporter GLUT5, plus GLUT7, GLUT9, and GLUT11), and class III (GLUT6, 8, 10, 12, and the myo-inositol transporter HMIT1). Functional characteristics have been reported for some of the novel GLUTs. Like GLUT1-4, they exhibit tissue/cell-specific expression (GLUT6, leukocytes, brain; GLUT8, testis, blastocysts, brain, muscle, adipocytes; GLUT9, liver, kidney; GLUT10, liver, pancreas; GLUT11, heart, skeletal muscle). GLUT6 and GLUT8 appear to be regulated by subcellular redistribution, because they are targeted to intracellular compartments by dileucine motifs in a dynamin-dependent manner. Sugar transport has been reported for GLUT6, 8, and 11; HMIT1 has been shown to be a H+/myo-inositol co-transporter. Thus, the members of the extended GLUT family exhibit a surprisingly diverse substrate specificity, and the definition of sequence elements determining this substrate specificity will require a full functional characterization of all members.
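The three-subfamily classification described above can be collected into a small lookup table (member names exactly as listed in the text; the helper function is illustrative):

```python
# Subfamilies of the extended GLUT (SLC2A) family, as described above.
GLUT_CLASSES = {
    "I":   ["GLUT1", "GLUT2", "GLUT3", "GLUT4"],
    "II":  ["GLUT5", "GLUT7", "GLUT9", "GLUT11"],
    "III": ["GLUT6", "GLUT8", "GLUT10", "GLUT12", "HMIT1"],
}

def glut_class(name):
    """Return the subfamily of a transporter, or None if unknown."""
    for cls, members in GLUT_CLASSES.items():
        if name in members:
            return cls
    return None

print(glut_class("GLUT8"))                        # → III
total = sum(len(m) for m in GLUT_CLASSES.values())
print(total)                                      # → 13
```

The total of 13 matches the "all 13 family members" whose sequences define the transporter signatures above.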

Relevance: 30.00%

Abstract:

Many traits and/or strategies expressed by organisms are quantitative phenotypes. Because populations are of finite size and genomes are subject to mutation, these continuously varying phenotypes are under the joint pressure of mutation, natural selection and random genetic drift. This article derives the stationary distribution of such a phenotype under mutation-selection-drift balance in a class-structured population, allowing for demographically varying class sizes and/or changing environmental conditions. The salient feature of the stationary distribution is that it can be characterized entirely in terms of the average size of the gene pool and Hamilton's inclusive fitness effect: the weight given to a phenotype varies exponentially with the cumulative inclusive fitness effect over phenotypic space, which thereby defines an adaptive landscape. The peaks of the landscape are the phenotypes that are candidate evolutionarily stable strategies and can be determined by standard phenotypic selection gradient methods (e.g. evolutionary game theory, kin selection theory, adaptive dynamics). The curvature of the stationary distribution provides a measure of the convergence stability of candidate evolutionarily stable strategies, and it is evaluated explicitly for two biological scenarios: first, a coordination game, which illustrates that, for a multipeaked adaptive landscape, stochastically stable strategies can be singled out by letting the size of the gene pool grow large; and second, a sex-allocation game for diploids and haplo-diploids, which suggests that the equilibrium sex ratio follows a Beta distribution with parameters depending on the features of the genetic system.
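Schematically, the stationary density described above can be sketched as follows (assumed notation, not the paper's exact expression: $z$ the phenotype, $S(x)$ the inclusive fitness effect, i.e. the selection gradient, and $\bar{N}$ the average size of the gene pool; constants are omitted):

```latex
p(z) \;\propto\; \exp\!\left( \bar{N} \int^{z} S(x)\,\mathrm{d}x \right)
```

Peaks of the exponent, where $S(z) = 0$ with $S'(z) < 0$, are the candidate evolutionarily stable strategies, and the curvature of $p$ at a peak sharpens as $\bar{N}$ grows, which is why a large gene pool singles out the stochastically stable peaks of a multipeaked landscape.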

Relevance: 30.00%

Abstract:

The purposes of this study were to characterize the performance of a 3-dimensional (3D) ordered-subset expectation maximization (OSEM) algorithm in the quantification of left ventricular (LV) function with (99m)Tc-labeled agent gated SPECT (G-SPECT), the QGS program, and a beating-heart phantom, and to optimize the reconstruction parameters for clinical applications. METHODS: A G-SPECT image of a dynamic heart phantom simulating the beating left ventricle was acquired. The exact volumes of the phantom were known: end-diastolic volume (EDV) of 112 mL, end-systolic volume (ESV) of 37 mL, and stroke volume (SV) of 75 mL, producing an LV ejection fraction (LVEF) of 67%. Tomographic reconstructions were obtained after 10-20 iterations (I) with 4, 8, and 16 subsets (S) at full width at half maximum (FWHM) Gaussian postprocessing filter cutoff values of 8-15 mm. The QGS program was used for quantitative measurements. RESULTS: Measured values ranged from 72 to 92 mL for EDV, from 18 to 32 mL for ESV, and from 54 to 63 mL for SV, and the calculated LVEF ranged from 65% to 76%. Overall, the combination of 10 I, 8 S, and a cutoff filter value of 10 mm produced the most accurate results. The plot of the measurements against the expectation maximization-equivalent iterations (I × S product) revealed a bell-shaped curve for the LV volumes and a reverse distribution for the LVEF, with the best results in the intermediate range. In particular, FWHM cutoff values exceeding 10 mm affected the estimation of the LV volumes. CONCLUSION: The QGS program is able to correctly calculate the LVEF when used in association with an optimized 3D OSEM algorithm (8 S, 10 I, and FWHM of 10 mm) but underestimates the LV volumes. However, various combinations of technical parameters, including a limited range of I and S (80-160 expectation maximization-equivalent iterations) and low cutoff values (≤10 mm) for the Gaussian postprocessing filter, produced results with similar accuracies and without clinically relevant differences in the LV volumes and the estimated LVEF.
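The phantom's ground-truth ejection fraction follows directly from the stated volumes; a one-line check using the values reported in the abstract:

```python
# Ground-truth phantom volumes from the study (mL).
EDV, ESV = 112.0, 37.0
SV = EDV - ESV                 # stroke volume
LVEF = 100.0 * SV / EDV        # ejection fraction, %
print(SV, round(LVEF))         # → 75.0 67
```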

Relevance: 30.00%

Abstract:

Background: We aimed to analyze the rate and time distribution of pre- and post-morbid cerebrovascular events in a single ischemic stroke population, and whether these depend on the etiology of the index stroke. Methods: In 2,203 consecutive patients admitted to a single stroke center registry (ASTRAL), the ischemic stroke that led to admission was considered the index event. Frequency distribution and cumulative relative distribution graphs of the most recent and first recurrent event (ischemic stroke, transient ischemic attack, intracranial or subarachnoid hemorrhage) were drawn in weekly and daily intervals for all strokes and for all stroke types. Results: The frequency of events at identical time points before and after the index stroke was most markedly reduced in the first week after (vs. before) stroke (1.0 vs. 4.2%, p < 0.001) and the first month (2.7 vs. 7.4%, p < 0.001), and the reduction then ebbed over the first year (8.4 vs. 13.1%, p < 0.001). On a daily basis, the peak frequency was noticed at day -1 (1.6%), with a reduction to 0.7% on the index day and 0.17% 24 h after. The event rate in patients with atherosclerotic stroke was particularly high around the index event, but the 1-year cumulative recurrence rate was similar in all stroke types. Conclusions: We confirm a short window of increased vulnerability in ischemic stroke and show a 4-, 3- and 2-fold reduction in post-stroke events at 1 week, 1 month and 1 year, respectively, compared to identical pre-stroke periods. This break in the 'stroke wave' is particularly striking after atherosclerotic and lacunar strokes.

Relevance: 30.00%

Abstract:

A simple method is presented to evaluate the effects of short-range correlations on the momentum distribution of nucleons in nuclear matter within the framework of the Green's function approach. The method provides a very efficient representation of the single-particle Green's function for a correlated system. The reliability of this method is established by comparing its results to those obtained in more elaborate calculations. The sensitivity of the momentum distribution to the nucleon-nucleon interaction and the nuclear density is studied. The momentum distributions of nucleons in finite nuclei are derived from those in nuclear matter using a local-density approximation. These results are compared to those obtained directly for light nuclei like 16O.

Relevance: 30.00%

Abstract:

The influence of hole-hole (h-h) propagation, in addition to the conventional particle-particle (p-p) propagation, on the energy per particle and the momentum distribution is investigated for the v2 central interaction, which is derived from Reid's soft-core potential. The results are compared to Brueckner-Hartree-Fock calculations with a continuous choice for the single-particle (SP) spectrum. Calculation of the energy from a self-consistently determined SP spectrum leads to a lower saturation density. This result is not corroborated by calculating the energy from the hole spectral function, which is, however, not self-consistent. A generalization of previous calculations of the momentum distribution, based on a Goldstone diagram expansion, is introduced that allows the inclusion of h-h contributions to all orders. From this result an alternative calculation of the kinetic energy is obtained. In addition, a direct calculation of the potential energy is presented, obtained from a solution of the ladder equation containing p-p and h-h propagation to all orders. These results can be considered as the contributions of selected Goldstone diagrams (including p-p and h-h terms on the same footing) to the kinetic and potential energy, in which the SP energy is given by the quasiparticle energy. The summation of Goldstone diagrams leads to a different momentum distribution than the one obtained from integrating the hole spectral function, which in general gives less depletion of the Fermi sea. Various arguments, based partly on the results that are obtained, are put forward that a self-consistent determination of the spectral functions including the p-p and h-h ladder contributions (using a realistic interaction) will shed light on the question of nuclear saturation at a nonrelativistic level that is consistent with the observed depletion of SP orbitals in finite nuclei.

Relevance: 30.00%

Abstract:

Classical cryptography is based on mathematical functions. The robustness of a cryptosystem essentially depends on the difficulty of computing the inverse of its one-way function, and no mathematical proof establishes whether it is impossible to find the inverse of a given one-way function. It is therefore mandatory to use a cryptosystem whose security is scientifically proven, especially for critical exchanges (banking systems, governments, etc.). Quantum cryptography answers this need: its security can be formally demonstrated, since it is based on the laws of quantum physics, which assure unconditionally secure operation. How can quantum cryptography be used and integrated into existing solutions? This thesis justifies the need for quantum cryptography and shows that the cost of deploying it is warranted. It proposes a simple, practical mechanism for integrating quantum cryptography into widely used communication protocols such as PPP, IPSec, and the 802.11i protocol. Application scenarios illustrate the feasibility of these solutions and allow their cost to be estimated. A methodology for evaluating quantum cryptography solutions according to the Common Criteria, with directives and checkpoints to support certification, is also proposed.

Relevance: 30.00%

Abstract:

The particle orientation in several γ-Fe2O3 magnetic tapes has been quantitatively evaluated using data from both Mössbauer and hysteresis loop measurements performed in the three orthogonal directions. A texture function has been obtained as an expansion in real harmonics. The profile of the texture function indicates the quality of the different magnetic tapes. A different degree of particle orientation at the surface of the tape is evidenced by means of conversion electron Mössbauer spectra.

Relevance: 30.00%

Abstract:

This study aimed to use a plantar pressure insole for estimating the three-dimensional ground reaction force (GRF) as well as the frictional torque (T(F)) during walking. Eleven subjects, six healthy and five patients with ankle disease, participated in the study, wearing pressure insoles during several walking trials on a force-plate. The plantar pressure distribution was analyzed, and 10 principal components of 24 regional pressure values, together with the stance time percentage (STP), were considered for GRF and T(F) estimation. Both linear and non-linear approximators were used for estimating the GRF and T(F), based on two learning strategies using intra-subject and inter-subject data. The RMS error and the correlation coefficient between the approximators and the actual patterns obtained from the force-plate were calculated. Our results showed better performance for non-linear approximation, especially when the STP was included as an input. The lowest errors were observed for the vertical force (4%) and anterior-posterior force (7.3%), while the medial-lateral force (11.3%) and frictional torque (14.7%) had higher errors. The results obtained for the patients showed higher errors; nevertheless, when data from the same patient were used for learning, the results improved, and in general only slight differences from healthy subjects were observed. In conclusion, this study showed that an ambulatory pressure insole with data normalization, an optimal choice of inputs, and a well-trained non-linear mapping function can efficiently estimate the three-dimensional ground reaction force and frictional torque over consecutive gait cycles without requiring a force-plate.
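A minimal sketch of the estimation pipeline described above: reduce the 24 regional pressure values to 10 principal components, append the stance time percentage, and fit a linear map to the force targets. The data below are synthetic with an assumed low-dimensional gait structure; the paper's non-linear approximator and insole specifics are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: 500 frames whose 24 regional pressures share a
# low-dimensional latent gait pattern, plus a stance time percentage.
latent = rng.normal(size=(500, 8))
X_press = latent @ rng.normal(size=(8, 24)) + 0.01 * rng.normal(size=(500, 24))
stp = rng.random((500, 1))  # stance time percentage (0..1)

# Hidden rule generating a 3-D ground reaction force from latent + STP.
Y = (latent @ rng.normal(size=(8, 3))
     + stp @ rng.normal(size=(1, 3))
     + 0.01 * rng.normal(size=(500, 3)))

# Step 1: project the 24 pressures onto their 10 leading principal components.
mu = X_press.mean(axis=0)
_, _, Vt = np.linalg.svd(X_press - mu, full_matrices=False)
pcs = (X_press - mu) @ Vt[:10].T

# Step 2: linear approximator from [10 PCs, STP, bias] to the 3-D GRF.
feats = np.hstack([pcs, stp, np.ones((500, 1))])
W, *_ = np.linalg.lstsq(feats, Y, rcond=None)
rmse = np.sqrt(np.mean((Y - feats @ W) ** 2))
print(rmse)  # small residual: the 10 PCs capture the latent structure
```

In the study the linear map is replaced by a trained non-linear approximator, which is what gave the better performance reported above.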