990 results for COMPLEMENTARY EXPONENTIAL GEOMETRIC DISTRIBUTION
Abstract:
We present a technique for online compression of ECG signals using the Golomb-Rice encoding algorithm, facilitated by a novel time-encoding asynchronous analog-to-digital converter targeted at low-power, implantable, long-term biomedical sensing applications. In contrast to capturing the actual signal (voltage) values, the asynchronous time encoder captures and encodes the times at which predefined changes occur in the signal, thereby minimizing the sensor's energy use and the number of bits stored to represent the information, since unnecessary samples are never captured. The time encoder transforms the ECG signal into pure time information that has a geometric distribution, so the Golomb-Rice encoding algorithm can be used to further compress the data. An overall online compression rate of about 6 times is achievable without the computations usually associated with most compression methods.
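The Golomb-Rice step the abstract relies on is simple to sketch: for a Rice parameter k, each non-negative integer (here, a time-encoder interval) is split into a unary-coded quotient and a k-bit binary remainder. A minimal illustration under that description, not the paper's implementation:

```python
def rice_encode(n, k):
    """Golomb-Rice code for non-negative integer n with parameter k (M = 2**k):
    the quotient n >> k in unary (q ones then a terminating zero), followed by
    the remainder in k binary bits. Near-optimal for geometric inputs."""
    q = n >> k
    r = n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "0{}b".format(k))

# Hypothetical inter-event times from a time encoder, Rice parameter k = 2.
codes = [rice_encode(t, 2) for t in (0, 3, 5, 12)]
# → ["000", "011", "1001", "111000"]
```

For geometrically distributed inputs, choosing k near the base-2 logarithm of the mean interval keeps the unary part short, which is what makes the scheme effective without the heavy computation of dictionary- or transform-based compressors.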
Abstract:
Proper maintenance of plant items is crucial for the safe and profitable operation of process plants. The relevant maintenance policies fall into the following four categories: (i) preventive/opportunistic/breakdown replacement policies, (ii) inspection/inspection-repair-replacement policies, (iii) restorative maintenance policies, and (iv) condition-based maintenance policies. For correlating failure times of component equipment and complete systems, the Weibull failure distribution has been used. A new, powerful method, SEQLIM, has been proposed for the estimation of the Weibull parameters, particularly when maintenance records contain very few failures and many successful operation times. When a system consists of a number of replaceable, ageing components, an opportunistic replacement policy has been found to be cost-effective, and a simple opportunistic model has been developed. Inspection models with various objective functions have been investigated. It was found that, on the assumption of a negative exponential failure distribution, all models converge to the same optimal inspection interval, provided the safety components are very reliable and the demand rate is low. When deterioration becomes a contributory factor to some failures, periodic inspections calculated from the above models are too frequent; a case of safety trip systems has been studied. A highly effective restorative maintenance policy can be developed if the performance of the equipment in this category can be related to some predictive modelling, and a novel fouling model has been proposed to determine cleaning strategies for condensers. Condition-based maintenance policies have been investigated, and a simple gauge has been designed for condition monitoring of relief-valve springs. A typical case of an exothermic inert gas generation plant has been studied to demonstrate how the various policies can be applied to devise overall maintenance actions.
Abstract:
2000 Mathematics Subject Classification: 60J80, 62P05.
Abstract:
Since the Morris worm was released in 1988, Internet worms have continued to be one of the top security threats. For example, the Conficker worm infected 9 to 15 million machines in early 2009 and shut down the services of some critical government and medical networks. Moreover, it constructed a massive peer-to-peer (P2P) botnet. Botnets are zombie networks that attackers control to launch coordinated attacks, and in recent years they have become the number one threat to the Internet. The objective of this research is to characterize the spatial-temporal infection structures of Internet worms, and to apply the observations to study P2P-based botnets formed by worm infection. First, we infer temporal characteristics of the Internet worm infection structure, i.e., the host infection time and the worm infection sequence, and thus pinpoint patient zero or the initially infected hosts. Specifically, we apply statistical estimation techniques to Darknet observations. We show analytically and empirically that our proposed estimators can significantly improve the inference accuracy. Second, we reveal two key spatial characteristics of the Internet worm infection structure, i.e., the number of children and the generation of the underlying tree topology formed by worm infection. Specifically, we apply probabilistic modeling methods and a sequential growth model. We show analytically and empirically that the number of children asymptotically has a geometric distribution with parameter 0.5, and that the generation closely follows a Poisson distribution. Finally, we evaluate bot detection strategies and the effects of user defenses in P2P-based botnets formed by worm infection. Specifically, we apply the observations on the number of children and demonstrate analytically and empirically that targeted detection focusing on the nodes with the largest numbers of children is an efficient way to expose bots. However, we also point out that future botnets may self-stop scanning to weaken targeted detection, without greatly slowing down the speed of worm infection. We then extend the worm spatial infection structure and show empirically that user defenses, e.g., patching or cleaning, can significantly mitigate the robustness and effectiveness of P2P-based botnets. As a counter-move, we evaluate a simple measure by which future botnets could enhance topology robustness through worm re-infection.
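The geometric-with-parameter-0.5 result for the number of children can be illustrated with a uniform-attachment sequential growth model; the uniform attachment rule is an assumption here, standing in for the dissertation's model, in which each newly infected host is treated as the child of a uniformly random already-infected host:

```python
import random
from collections import Counter

def infection_tree_children(n, seed=0):
    """Sequential growth sketch: node 0 is patient zero; each new node
    attaches to a uniformly random already-infected node. Returns the
    number of children of every node in the resulting tree."""
    rng = random.Random(seed)
    children = [0] * n
    for node in range(1, n):
        children[rng.randrange(node)] += 1
    return children

children = infection_tree_children(100_000)
hist = Counter(children)
# The fraction of nodes with c children should approach 0.5 ** (c + 1),
# i.e., a geometric distribution with parameter 0.5.
fractions = [hist[c] / len(children) for c in range(3)]
```

This is the classical degree behavior of uniform random recursive trees, which is why targeted detection of high-child-count nodes is effective: a constant fraction of bots (about half) are leaves, while nodes with many children are exponentially rare and highly informative.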
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa in fulfilment of the requirements for the degree of Master in Biomedical Engineering. This dissertation was developed at the Erasmus Medical Center in Rotterdam, the Netherlands.
Abstract:
PURPOSE: To evaluate 2 left ventricular mass index (LVMI) normality criteria for the prevalence of left ventricular geometric patterns in a hypertensive (HT) population. METHODS: 544 essential hypertensive patients were evaluated by echocardiography, and different left ventricular hypertrophy criteria were applied: 1 - classic: men - 134 g/m² and women - 110 g/m²; 2 - obtained from the 95th percentile of LVMI in a normotensive (NT) population. RESULTS: The prevalences of the 4 left ventricular geometric patterns, respectively for criteria 1 and 2, were: normal geometry - 47.7% and 39.3%; concentric remodeling - 25.4% and 14.3%; concentric hypertrophy - 18.4% and 27.7%; and eccentric hypertrophy - 8.8% and 16.7%, which conferred abnormal geometry on 52.6% and 60.7% of the hypertensives. The comparative analysis between the NT group and the normal-geometry hypertensive group according to criterion 1 detected significant structural differences (*p < 0.05): LVMI - 78.4 ± 1.50 vs 85.9 ± 0.95 g/m²*; posterior wall thickness - 8.5 ± 0.1 vs 8.9 ± 0.05 mm*; left atrium - 33.3 ± 0.41 vs 34.7 ± 0.30 mm*. With criterion 2, significant structural differences between the 2 groups were not observed. CONCLUSION: The use of a reference-population-based criterion increased the prevalence of abnormal left ventricular geometry in hypertensive patients and seemed more appropriate for left ventricular hypertrophy detection and risk stratification.
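The four geometric patterns referred to above are conventionally assigned from LVMI together with relative wall thickness (RWT). A hypothetical classifier using the study's "classic" LVMI cutoffs; the RWT cutoff of 0.45 is a common literature value and an assumption here, since the abstract does not state one:

```python
def lv_geometry(lvmi, male, rwt, rwt_cutoff=0.45):
    """Classify the left ventricular geometric pattern from LVMI (g/m^2)
    and relative wall thickness (RWT). LVMI cutoffs follow the 'classic'
    criterion quoted in the abstract (men 134, women 110 g/m^2); the RWT
    cutoff of 0.45 is an assumed, conventional value."""
    hypertrophy = lvmi > (134 if male else 110)
    if hypertrophy:
        return "concentric hypertrophy" if rwt >= rwt_cutoff else "eccentric hypertrophy"
    return "concentric remodeling" if rwt >= rwt_cutoff else "normal geometry"
```

Lowering the LVMI cutoff (criterion 2 in the study) moves patients out of "normal geometry" and "concentric remodeling" into the two hypertrophy classes, which is exactly the prevalence shift the RESULTS section reports.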
Abstract:
The generalized exponential distribution, proposed by Gupta and Kundu (1999), is a good alternative to standard lifetime distributions such as the exponential, Weibull, or gamma. Several authors have considered the problem of Bayesian estimation of the parameters of the generalized exponential distribution, assuming independent gamma priors and other informative priors. In this paper, we consider a Bayesian analysis of the generalized exponential distribution assuming conventional non-informative prior distributions, such as the Jeffreys and reference priors, to estimate the parameters. These priors are compared with independent gamma priors for both parameters. The comparison is carried out by examining the frequentist coverage probabilities of Bayesian credible intervals. We show that the maximal data information prior implies an improper posterior distribution for the parameters of a generalized exponential distribution. It is also shown that the choice of the parameter of interest is very important for the reference prior, as different choices lead to different reference priors in this case. Numerical inference is illustrated for the parameters by considering data sets of different sizes and using MCMC (Markov chain Monte Carlo) methods.
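A minimal sketch of the kind of MCMC inference the abstract describes: random-walk Metropolis on the log-parameters of the generalized exponential distribution. The independent Gamma(0.01, 0.01) priors, starting point, and step size below are illustrative assumptions, not the paper's exact settings (the paper compares several priors):

```python
import math
import random

def ge_loglik(data, alpha, lam):
    """Log-likelihood of the generalized exponential GE(alpha, lambda):
    f(x) = alpha * lam * exp(-lam*x) * (1 - exp(-lam*x))**(alpha - 1)."""
    return sum(math.log(alpha) + math.log(lam) - lam * x
               + (alpha - 1) * math.log1p(-math.exp(-lam * x))
               for x in data)

def metropolis(data, n_iter=5000, step=0.1, seed=1):
    """Random-walk Metropolis on (log alpha, log lambda) with independent
    Gamma(0.01, 0.01) priors (an illustrative, weakly informative choice)."""
    rng = random.Random(seed)

    def log_post(log_a, log_l):
        a, l = math.exp(log_a), math.exp(log_l)
        # Gamma(0.01, 0.01) log-prior plus the Jacobian of the log transform:
        # (shape - 1)*log(theta) - rate*theta + log(theta).
        log_prior = 0.01 * log_a - 0.01 * a + 0.01 * log_l - 0.01 * l
        return ge_loglik(data, a, l) + log_prior

    log_a = log_l = 0.0  # start at alpha = lambda = 1
    cur = log_post(log_a, log_l)
    draws = []
    for _ in range(n_iter):
        pa = log_a + rng.gauss(0.0, step)
        pl = log_l + rng.gauss(0.0, step)
        prop = log_post(pa, pl)
        if math.log(rng.random()) < prop - cur:  # Metropolis accept/reject
            log_a, log_l, cur = pa, pl, prop
        draws.append((math.exp(log_a), math.exp(log_l)))
    return draws
```

Sampling on the log scale keeps both parameters positive without boundary handling; after a burn-in, posterior means and credible intervals are read off the retained draws.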
Abstract:
Background Delignification pretreatments of biomass and methods to assess their efficacy are crucial for biomass-to-biofuels research and technology. Here, we applied confocal and fluorescence lifetime imaging microscopy (FLIM) using one- and two-photon excitation to map the lignin distribution within bagasse fibers pretreated with acid and alkali. The evaluated spectra and decay times are correlated with previously calculated lignin fractions. We have also investigated the influence of the pretreatment on the lignin distribution in the cell wall by analyzing the changes in the fluorescence characteristics under two-photon excitation. Eucalyptus fibers were also analyzed for comparison. Results Fluorescence spectra and variations of the decay time correlate well with the delignification yield and the lignin distribution. The decays are treated as two-exponential, with a fast (τ1) and a slow (τ2) decay time. The fast decay is associated with concentrated lignin in the bagasse and has a low sensitivity to the treatment. The fluorescence decay time became longer with increasing alkali concentration used in the treatment, which corresponds to lignin emission in a less concentrated environment. In addition, the two-photon fluorescence spectrum is very sensitive to lignin content and accumulation in the cell wall, broadening with the acid pretreatment and narrowing with the alkali one. Heterogeneity of the pretreated cell wall was observed. Conclusions Our results reveal lignin domains with different concentration levels. The acid pretreatment caused a disorder in the arrangement of lignin and its accumulation at the external border of the cell wall. The alkali pretreatment efficiently removed lignin from the middle of the bagasse fibers, but was less effective in its removal from their surfaces. Our results evidence a strong correlation between the decay times of the lignin fluorescence and its distribution within the cell wall. A new variety of lignin fluorescence states was accessed by two-photon excitation, which allowed an even broader, but complementary, optical characterization of lignocellulosic materials. These results suggest that the lignin arrangement in untreated bagasse fiber is based on a well-organized nanoenvironment that favors a very low level of interaction between the molecules.
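The two-exponential decay analysis described above can be written compactly. A minimal sketch of the model and a common summary statistic; parameter names are generic, and extracting τ1 and τ2 from FLIM data would require a fitting routine not shown here:

```python
import math

def biexp(t, a1, tau1, a2, tau2):
    """Two-exponential decay model I(t) = a1*exp(-t/tau1) + a2*exp(-t/tau2),
    with tau1 the fast (concentrated-lignin) and tau2 the slow component."""
    return a1 * math.exp(-t / tau1) + a2 * math.exp(-t / tau2)

def mean_lifetime(a1, tau1, a2, tau2):
    """Intensity-weighted mean decay time of the two-exponential model,
    often used as a single per-pixel summary in FLIM maps."""
    return (a1 * tau1 ** 2 + a2 * tau2 ** 2) / (a1 * tau1 + a2 * tau2)
```

Under this parameterization, the abstract's observation that alkali treatment lengthens the decay corresponds to the slow component τ2 (and hence the mean lifetime) growing as lignin becomes more dilute.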
Abstract:
Sizes and powers of selected two-sample tests of the equality of survival distributions are compared by simulation for small samples from unequally, randomly censored exponential distributions. The tests investigated include parametric tests (F, Score, Likelihood, Asymptotic), logrank tests (Mantel, Peto-Peto), and Wilcoxon-type tests (Gehan, Prentice). Equal-sized samples, n = 8, 16, 32, with 1000 (size) and 500 (power) simulation trials, are compared for 16 combinations of the censoring proportions 0%, 20%, 40%, and 60%. For n = 8 and 16, the Asymptotic, Peto-Peto, and Wilcoxon tests perform at nominal 5% size expectations, but the F, Score, and Mantel tests exceeded the 5% size confidence limits for one third of the censoring combinations. For n = 32, all tests showed proper size, with the Peto-Peto test the most conservative in the presence of unequal censoring. Powers of all tests are compared for exponential hazard ratios of 1.4 and 2.0. There is little difference in the power characteristics of the tests within the classes of tests considered. The Mantel test showed 90% to 95% power efficiency relative to the parametric tests. Wilcoxon-type tests have the lowest relative power but are robust to differential censoring patterns. A modified Peto-Peto test shows power comparable to the Mantel test. For n = 32, a specific Weibull-exponential comparison of crossing survival curves suggests that the relative powers of logrank and Wilcoxon-type tests depend on the scale parameter of the Weibull distribution: Wilcoxon-type tests appear more powerful than logrank tests for late-crossing survival curves and less powerful for early-crossing ones. Guidelines for the appropriate selection of two-sample tests are given.
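The simulation design described above is easy to reproduce in outline: draw exponential failure times, right-censor them with independent exponential censoring times, and compute a two-sample statistic. A sketch of the Mantel (logrank) statistic under this design; the sample sizes and rates used below are illustrative, not the study's exact settings:

```python
import math
import random

def logrank_z(times1, events1, times2, events2):
    """Mantel (logrank) z-statistic for two right-censored samples.
    times*: follow-up times; events*: 1 = failure observed, 0 = censored."""
    pooled = sorted({t for t, e in zip(times1 + times2, events1 + events2) if e})
    obs_minus_exp = var = 0.0
    for t in pooled:
        n1 = sum(1 for x in times1 if x >= t)          # group 1 at risk
        n2 = sum(1 for x in times2 if x >= t)          # group 2 at risk
        d1 = sum(1 for x, e in zip(times1, events1) if e and x == t)
        d2 = sum(1 for x, e in zip(times2, events2) if e and x == t)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        obs_minus_exp += d1 - d * n1 / n               # O1 - E1 at time t
        var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return obs_minus_exp / math.sqrt(var)

def censored_exponential(n, hazard, cens_rate, rng):
    """n exponential failure times, randomly right-censored by an
    independent exponential censoring time (unequal censoring is modeled
    by giving the two groups different cens_rate values)."""
    times, events = [], []
    for _ in range(n):
        f = rng.expovariate(hazard)
        c = rng.expovariate(cens_rate)
        times.append(min(f, c))
        events.append(1 if f <= c else 0)
    return times, events
```

Repeating this over many trials and counting rejections at |z| > 1.96 yields exactly the size (equal hazards) and power (hazard ratio 1.4 or 2.0) estimates the abstract tabulates.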
Abstract:
Motivated by the observation of spiral patterns in a wide range of physical, chemical, and biological systems, we present an automated approach that aims at quantitatively characterizing spiral-like elements in complex stripelike patterns. The approach provides the location of the spiral tip and the size of the spiral arms in terms of their arc length and their winding number. In addition, it yields the number of pattern components (Betti number of order 1), as well as their size and certain aspects of their shape. We apply the method to spiral defect chaos in thermally driven Rayleigh-Bénard convection and find that the arc length of spirals decreases monotonically with decreasing Prandtl number of the fluid and increasing heating. By contrast, the winding number of the spirals is nonmonotonic in the heating. The distribution function for the number of spirals is significantly narrower than a Poisson distribution. The distribution function for the winding number shows approximately an exponential decay. It depends only weakly on the heating, but strongly on the Prandtl number. Large spirals arise only for larger Prandtl numbers. In this regime the joint distribution for the spiral length and the winding number exhibits a three-peak structure, indicating the dominance of Archimedean spirals of opposite sign and relatively straight sections. For small Prandtl numbers the distribution function reveals a large number of small compact pattern components.
Abstract:
An experimental method for characterizing the time-resolved phase noise of a fast-switching tunable laser is discussed. The method experimentally determines a complementary cumulative distribution function of the laser's differential phase as a function of time after a switching event. A time-resolved bit error rate for differential quadrature phase shift keying formatted data, calculated from the phase noise measurements, was fitted to an experimental time-resolved bit error rate measured with a field-programmable gate array, and good agreement was found between the two time-resolved bit error rates.
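The complementary cumulative distribution function at the heart of the method is, empirically, just a sorted-sample tail estimate. A minimal sketch with simulated stand-in phase samples; in the actual method the samples come from the laser measurement, and the Gaussian stand-in below is purely an assumption for illustration:

```python
import random

def empirical_ccdf(samples):
    """Empirical complementary CDF: for each sorted sample x_(i),
    estimate P(X > x_(i)) as 1 - (i + 1)/n."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, 1.0 - (i + 1) / n) for i, x in enumerate(xs)]

# Illustrative stand-in for measured differential phase magnitudes after a
# switching event (Gaussian noise here; the real data come from the laser).
rng = random.Random(7)
phase = [abs(rng.gauss(0.0, 0.3)) for _ in range(1000)]
ccdf = empirical_ccdf(phase)
```

Evaluating such a CCDF in successive time windows after each switching event gives exactly the "differential phase as a function of time" characterization described, from which a time-resolved bit error rate can be computed for a chosen decision threshold.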
Abstract:
2000 Mathematics Subject Classification: 62G30, 62E10.