833 results for Robustness
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. RESULTS: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods based on sigmoidal or exponential curve fitting were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and to a lesser extent on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimates of relative expression ratios of two-fold or higher, and our analysis provides an estimate of the number of biological samples that have to be analyzed to achieve a given precision.
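To make the efficiency-based strategies concrete, here is a minimal sketch of an efficiency-corrected relative expression calculation in Python. It follows the general Pfaffl-style ratio rather than any specific method from the study, and all efficiency and Cq values below are illustrative assumptions, not data from the paper.

```python
# Illustrative efficiency-corrected relative expression ratio (Pfaffl-style).
# All numbers below are made-up examples, not data from the study.

def relative_expression(e_target, e_ref, cq_target_ctrl, cq_target_treated,
                        cq_ref_ctrl, cq_ref_treated):
    """Ratio of target expression (treated vs control), normalised to a
    reference amplicon, using amplicon-specific efficiencies E (1 < E <= 2)."""
    delta_cq_target = cq_target_ctrl - cq_target_treated
    delta_cq_ref = cq_ref_ctrl - cq_ref_treated
    return (e_target ** delta_cq_target) / (e_ref ** delta_cq_ref)

# Example with averaged per-amplicon efficiencies and hypothetical Cq values.
ratio = relative_expression(e_target=1.92, e_ref=1.88,
                            cq_target_ctrl=24.1, cq_target_treated=22.9,
                            cq_ref_ctrl=20.3, cq_ref_treated=20.2)
print(f"Relative expression ratio: {ratio:.2f}")
```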
Abstract:
Comparative analyses of survival senescence by using life tables have identified generalizations, including the observation that mammals senesce faster than similar-sized birds. These generalizations have been challenged because of limitations of life-table approaches and the growing appreciation that senescence is more than an increasing probability of death. Without using life tables, we examine senescence rates in annual individual fitness using 20 individual-based data sets of terrestrial vertebrates with contrasting life histories and body sizes. We find that senescence is widespread in the wild and equally likely to occur in survival and reproduction. Additionally, mammals senesce faster than birds because they have a faster life history for a given body size. By allowing us to disentangle the effects of the two major fitness components, our methods allow an assessment of the robustness of the prevalent life-table approach. Focusing on one aspect of life history - survival or recruitment - can provide reliable information on overall senescence.
Abstract:
Topological order has proven a useful concept to describe quantum phase transitions that are not captured by Ginzburg-Landau-type symmetry-breaking order. However, lacking a local order parameter, topological order is hard to detect. One way to detect it is by direct observation of the anyonic properties of excitations, which are usually discussed in the thermodynamic limit but have so far not been observed in macroscopic quantum Hall samples. Here we consider a system of a few interacting bosons restricted to the lowest Landau level by a gauge potential, and theoretically investigate vortex excitations in order to identify topological properties of different ground states. Our investigation demonstrates that even in surprisingly small systems anyonic properties are able to characterize the topological order. In addition, focusing on a system in the Laughlin state, we study the robustness of its anyonic behavior in the presence of tunable finite-range interactions acting as a perturbation. A transition to a different state is clearly signaled by the system's anyonic properties.
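For orientation, the following is standard textbook background on the Laughlin state and its anyonic excitations, not a result specific to this paper's few-boson system: the bosonic Laughlin wavefunction at filling nu = 1/m (in lowest-Landau-level complex coordinates, magnetic length set to 1) and the fractional statistics phase acquired when one quasihole is exchanged with another.

```latex
% Standard background, not a result of this paper:
% Laughlin state at filling \nu = 1/m and its quasihole exchange phase.
\[
\Psi_{1/m}(z_1,\dots,z_N) = \prod_{i<j}(z_i - z_j)^{m}\,
\exp\!\Big(-\tfrac{1}{4}\sum_{k}|z_k|^{2}\Big),
\qquad
\theta_{\mathrm{exchange}} = \frac{\pi}{m}.
\]
```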
Abstract:
Background: In longitudinal studies where subjects experience recurrent events over a period of time, such as respiratory infections, fever or diarrhea, statistical methods are required to take into account the within-subject correlation. Methods: For repeated events data with censored failure times, the independent increment (AG), marginal (WLW) and conditional (PWP) models are three multiple-failure models that generalize Cox's proportional hazards model. In this paper, we assess the efficiency, accuracy and robustness of all three models under simulated scenarios with varying degrees of within-subject correlation, censoring levels, maximum number of possible recurrences and sample size. We also study the methods' performance on a real dataset from a cohort study with bronchial obstruction. Results: We find substantial differences between the methods, and no single method is optimal. AG and PWP seem to be preferable to WLW for low correlation levels, but the situation is reversed for high correlations. Conclusions: All methods are stable under censoring, worsen as the number of recurrences increases, and share a bias problem which, among other consequences, makes asymptotic normal confidence intervals not fully reliable, although these are well developed theoretically.
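As an illustration of how one of these models can be fitted in practice, below is a minimal sketch of an Andersen-Gill (AG) fit on counting-process (start, stop] data using the Python lifelines library. The toy data frame and column names are assumptions made for the sketch, and the WLW and PWP variants, which differ in their risk sets and stratification, are not shown; in practice one would also use a robust (sandwich) variance clustered on subject.

```python
# Illustrative Andersen-Gill (AG) fit on counting-process data with lifelines.
# The toy data frame and column names are assumptions; each row is one
# (start, stop] interval for a subject, treated as an independent increment.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

df = pd.DataFrame({
    "id":    [1, 1, 1, 2, 2, 3],
    "start": [0, 30, 75, 0, 40, 0],
    "stop":  [30, 75, 120, 40, 100, 90],   # days until each event / censoring
    "event": [1, 1, 0, 1, 0, 0],           # 1 = recurrence observed, 0 = censored
    "treat": [1, 1, 1, 0, 0, 0],           # covariate of interest
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()
```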
Abstract:
This paper presents a validation study on statistical nonsupervised brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different noise and intensity nonuniformities are added to simulate real imaging conditions. No enhancement of the image quality is considered either before or during the classification process. This way, the accuracy of the methods and their robustness against image artifacts are tested. Classification is also performed on real data where a quantitative validation compares the methods' results with an estimated ground truth from manual segmentations by experts. Validity of the various classification methods in the labeling of the image as well as in the tissue volume is estimated with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that only consider pure Gaussian classes. Finally, we show that simulated data results can also be extended to real data.
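For orientation, the sketch below shows the intensity-only Gaussian mixture classification that such validation studies typically include as a baseline. The synthetic intensities are made up, and the spatial models (e.g. Markov random field priors) and partial-volume classes that the study finds important are deliberately omitted.

```python
# Illustrative intensity-only Gaussian mixture classification of MR voxels.
# Synthetic 1-D intensities stand in for real image data; spatial priors and
# partial-volume classes are omitted from this baseline sketch.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Fake intensity samples for three "tissues" (e.g. CSF, grey matter, white matter).
intensities = np.concatenate([
    rng.normal(60, 8, 2000),    # tissue 1
    rng.normal(110, 10, 3000),  # tissue 2
    rng.normal(160, 9, 2500),   # tissue 3
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(intensities)
print("Estimated class means:", np.sort(gmm.means_.ravel()))
```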
Abstract:
Many eukaryote organisms are polyploid. However, despite their importance, evolutionary inference of polyploid origins and modes of inheritance has been limited by a need for analyses of allele segregation at multiple loci using crosses. The increasing availability of sequence data for nonmodel species now allows the application of established approaches for the analysis of genomic data in polyploids. Here, we ask whether approximate Bayesian computation (ABC), applied to realistic traditional and next-generation sequence data, allows correct inference of the evolutionary and demographic history of polyploids. Using simulations, we evaluate the robustness of evolutionary inference by ABC for tetraploid species as a function of the number of individuals and loci sampled, and the presence or absence of an outgroup. We find that ABC adequately retrieves the recent evolutionary history of polyploid species on the basis of both old and new sequencing technologies. The application of ABC to sequence data from diploid and polyploid species of the plant genus Capsella confirms its utility. Our analysis strongly supports an allopolyploid origin of C. bursa-pastoris about 80 000 years ago. This conclusion runs contrary to previous findings based on the same data set but using an alternative approach and is in agreement with recent findings based on whole-genome sequencing. Our results indicate that ABC is a promising and powerful method for revealing the evolution of polyploid species, without the need to attribute alleles to a homeologous chromosome pair. The approach can readily be extended to more complex scenarios involving higher ploidy levels.
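The core rejection-ABC idea can be sketched as follows; the toy simulator and single summary statistic are placeholders standing in for the coalescent simulations of diploid/tetraploid samples and the multi-locus summary statistics used for polyploid inference.

```python
# Illustrative rejection-ABC loop. The "simulator" and summary statistic are toy
# placeholders, not the coalescent machinery used in the study.
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=50):
    """Toy data simulator: draws whose variance is controlled by theta."""
    return rng.normal(0.0, np.sqrt(theta), size=n)

def summary(x):
    return np.var(x)

observed = simulate(theta=2.0)
s_obs = summary(observed)

accepted = []
for _ in range(20000):
    theta = rng.uniform(0.1, 10.0)      # draw a candidate from the prior
    s_sim = summary(simulate(theta))
    if abs(s_sim - s_obs) < 0.1:        # keep draws close to the observation
        accepted.append(theta)

print(f"posterior mean ~ {np.mean(accepted):.2f} from {len(accepted)} accepted draws")
```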
Abstract:
This paper proposes a novel high-capacity robust audio watermarking algorithm that uses the high-frequency band of the wavelet decomposition, at which the human auditory system (HAS) is not very sensitive to alteration. The main idea is to divide the high-frequency band into frames and, for embedding, to change the wavelet samples depending on the average of the relevant frame's samples. The experimental results show that the method has a very high capacity (about 11,000 bps), without significant perceptual distortion (ODG in [-1, 0] and SNR about 30 dB), and provides robustness against common audio signal processing such as additive noise, filtering, echo and MPEG compression (MP3).
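A minimal single-level sketch of this embedding idea (one bit per frame of the high-frequency detail band, shifting samples relative to the frame average) might look like the following; the wavelet, frame length and embedding strength are assumptions of the sketch, not the paper's parameters.

```python
# Illustrative single-level version of "embed in the high-frequency wavelet band,
# one bit per frame, relative to the frame average". Wavelet, frame size and
# strength are assumptions for the sketch.
import numpy as np
import pywt

def embed(signal, bits, frame_len=64, strength=0.05, wavelet="db4"):
    cA, cD = pywt.dwt(signal, wavelet)           # cD = high-frequency band
    cD = cD.copy()
    for i, bit in enumerate(bits):
        frame = slice(i * frame_len, (i + 1) * frame_len)
        if cD[frame].size < frame_len:
            break
        avg = np.mean(np.abs(cD[frame]))
        # push the frame's samples up or down around their average, depending on the bit
        cD[frame] += (strength * avg) if bit else -(strength * avg)
    return pywt.idwt(cA, cD, wavelet)

audio = np.random.default_rng(2).normal(size=8192)   # stand-in for an audio signal
marked = embed(audio, bits=[1, 0, 1, 1, 0])
print("max sample change:", np.max(np.abs(marked[:len(audio)] - audio)))
```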
Abstract:
Many audio watermarking schemes divide the audio signal into several blocks such that part of the watermark is embedded into each of them. One of the key issues in these block-oriented watermarking schemes is to preserve the synchronisation, i.e. to recover the exact position of each block in the mark recovery process. In this paper, a novel time domain synchronisation technique is presented together with a new blind watermarking scheme which works in the Discrete Fourier Transform (DFT or FFT) domain. The combined scheme provides excellent imperceptibility results whilst achieving robustness against typical attacks. Furthermore, the execution of the scheme is fast enough to be used in real-time applications. The excellent transparency of the embedding algorithm makes it particularly useful for professional applications, such as the embedding of monitoring information in broadcast signals. The scheme is also compared with some recent results of the literature.
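As a rough illustration of blind embedding in the DFT domain, the sketch below writes one bit per block into the quantised magnitude of a mid-frequency bin. The block length, bin index and quantisation step are assumptions, and the paper's time-domain synchronisation technique is not reproduced here.

```python
# Illustrative blind FFT-domain embedding: one bit per block, written into the
# magnitude of a mid-frequency bin by quantisation (QIM). Bin index, block length
# and step size are assumptions of the sketch.
import numpy as np

def embed_block(block, bit, bin_idx=40, step=0.8):
    spec = np.fft.rfft(block)
    mag, phase = np.abs(spec), np.angle(spec)
    q = np.round(mag[bin_idx] / step)
    # force the quantised magnitude to the parity that encodes the bit
    if int(q) % 2 != bit:
        q += 1
    mag[bin_idx] = q * step
    return np.fft.irfft(mag * np.exp(1j * phase), n=len(block))

rng = np.random.default_rng(3)
audio = rng.normal(size=4096)
blocks = audio.reshape(-1, 512)                      # 8 blocks of 512 samples
bits = [1, 0, 1, 1, 0, 0, 1, 0]
marked = np.concatenate([embed_block(b, bit) for b, bit in zip(blocks, bits)])
print("mean absolute change:", np.mean(np.abs(marked - audio)))
```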
Abstract:
Cognitive radio is a wireless technology aimed at improving the efficient use of the radio spectrum, thus facilitating a reduction in the load on the free frequency bands. Cognitive radio networks can scan the spectrum and adapt their parameters to operate in the unoccupied bands. To avoid interfering with licensed users operating on a given channel, the networks need to be highly sensitive, which is achieved by using cooperative sensing methods. Current cooperative sensing methods are not robust enough against occasional or continuous attacks. This article outlines a Group Fusion method that takes into account the behavior of users over the short and long term. On fusing the data, the method is based on giving more weight to user groups that are more unanimous in their decisions. Simulations have been performed in a dynamic environment with interferences. Results prove that when attackers are present (whether persistent or sporadic), the proposed Group Fusion method has superior sensing capability to other methods.
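The stated fusion principle (groups that are more unanimous receive more weight) could be sketched as follows. The grouping, weight formula and decision threshold are assumptions made only for illustration and do not reproduce the exact Group Fusion algorithm of the article.

```python
# Illustrative unanimity-weighted fusion of binary spectrum-sensing decisions.
# Grouping, weight formula and threshold are assumptions for the sketch.
import numpy as np

def fuse(decisions, group_size=4):
    """decisions: 1-D array of 0/1 local decisions (1 = primary user detected)."""
    decisions = np.asarray(decisions)
    n_groups = len(decisions) // group_size
    groups = decisions[: n_groups * group_size].reshape(n_groups, group_size)
    votes = groups.mean(axis=1)                 # fraction of "detected" per group
    unanimity = np.abs(votes - 0.5) * 2         # 1 = unanimous, 0 = evenly split
    weighted = np.sum(unanimity * votes) / (np.sum(unanimity) + 1e-12)
    return int(weighted >= 0.5), weighted

decision, score = fuse([1, 1, 1, 1,   0, 1, 0, 1,   1, 1, 0, 1])
print(decision, round(score, 3))
```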
Abstract:
This paper describes an audio watermarking scheme based on lossy compression. The main idea is taken from an image watermarking approach in which the JPEG compression algorithm is used to determine where and how the mark should be placed. Similarly, in the audio scheme suggested in this paper, an MPEG 1 Layer 3 algorithm is chosen for compression to determine the position of the mark bits and, thus, the psychoacoustic masking of the MPEG 1 Layer 3 compression is implicitly used. This methodology provides a high degree of robustness against compression attacks. The suggested scheme is also shown to succeed against most of the StirMark benchmark attacks for audio.
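To illustrate the general idea of letting a lossy codec decide where a mark can safely be placed, the sketch below uses a crude spectral requantisation as a stand-in for an MPEG 1 Layer 3 encode/decode round trip (a real implementation would call an actual MP3 codec) and keeps the spectral positions least affected by it. This is only the position-selection idea, not the paper's scheme.

```python
# Illustrative "let the lossy codec choose the embedding positions" idea.
# Coarse spectral requantisation is a crude stand-in for an MP3 round trip.
import numpy as np

def lossy_round_trip(signal, levels=64):
    spec = np.fft.rfft(signal)
    step = np.max(np.abs(spec)) / levels
    return np.fft.irfft(np.round(spec / step) * step, n=len(signal))

rng = np.random.default_rng(4)
audio = rng.normal(size=2048)
change = np.abs(np.fft.rfft(audio)) - np.abs(np.fft.rfft(lossy_round_trip(audio)))
positions = np.argsort(np.abs(change))[:16]   # bins least affected by "compression"
print("candidate embedding bins:", np.sort(positions))
```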
Abstract:
Cognitive radio is a wireless technology proposed to use the resources of the radio spectrum efficiently, thereby reducing the existing load on the licence-free frequency bands. Cognitive radio networks are able to scan the spectrum and adapt their parameters to operate in the unoccupied bands. To avoid interfering with licensed users operating on a given channel, the sensitivity of the networks has to be very high. This is achieved with cooperative sensing methods. Current cooperative sensing methods lack robustness against both sporadic and continuous attacks. In this article we present a Group Fusion method that takes into account the short- and long-term behaviour of users. When fusing the data, the method is based on giving more weight to the groups of users that are more unanimous in their decisions. Simulation results show that, in the presence of attackers, the proposed Group Fusion method achieves better detection than other methods, meeting the minimum sensitivity requirements of cognitive radio networks even with 12% of persistently malicious users or 10% of sporadic attackers.
Abstract:
One of the most important issues in molecular biology is to understand the regulatory mechanisms that control gene expression. Gene expression is often regulated by proteins, called transcription factors, which bind to short (5 to 20 base pairs), degenerate segments of DNA. Experimental effort towards understanding the sequence specificity of transcription factors is laborious and expensive, but can be substantially accelerated with the use of computational predictions. This thesis describes the use of algorithms and resources for transcription factor binding site analysis in addressing quantitative modelling, where probabilistic models are built to represent the binding properties of a transcription factor and can be used to find new functional binding sites in genomes. Initially, an open-access database (HTPSELEX) was created, holding high-quality binding sequences for two eukaryotic families of transcription factors, namely CTF/NF1 and LEF1/TCF. The binding sequences were elucidated using a recently described experimental procedure called HTP-SELEX, which allows the generation of a large number (> 1000) of binding sites using mass sequencing technology. For each HTP-SELEX experiment we also provide accurate primary experimental information about the protein material used, details of the wet-lab protocol, an archive of sequencing trace files, and assembled clone sequences of binding sequences. The database also offers reasonably large SELEX libraries obtained with conventional low-throughput protocols. The database is available at http://wwwisrec.isb-sib.ch/htpselex/ and ftp://ftp.isrec.isb-sib.ch/pub/databases/htpselex. The Expectation-Maximisation (EM) algorithm is one of the methods frequently used to estimate probabilistic models representing the sequence specificity of transcription factors. We present computer simulations that estimate the precision of EM-estimated models as a function of data set parameters (such as the length of the initial sequences, the number of initial sequences, and the percentage of non-binding sequences). We observed a remarkable robustness of the EM algorithm with regard to the length of the training sequences and the degree of contamination. The HTPSELEX database and the benchmarked results of the EM algorithm formed part of the foundation for the subsequent project, in which a statistical framework based on hidden Markov models was developed to represent the sequence specificity of the transcription factors CTF/NF1 and LEF1/TCF using the HTP-SELEX experiment data. The hidden Markov model framework is capable of both predicting and classifying CTF/NF1 and LEF1/TCF binding sites. A covariance analysis of the binding sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism. We next tested the LEF1/TCF model by computing binding scores for a set of LEF1/TCF binding sequences for which relative affinities were determined experimentally using non-linear regression. The predicted and experimentally determined binding affinities were in good agreement.
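As a much simpler stand-in for the EM-estimated models and hidden Markov models developed in the thesis, the sketch below builds a position weight matrix from a few toy aligned binding sites and scans a sequence for high-scoring candidate sites; all sequences, the pseudocount and the score threshold are illustrative.

```python
# Illustrative position weight matrix (PWM) built from toy aligned binding sites
# and used to scan a sequence for candidate sites. A simplified stand-in for the
# thesis' EM-estimated models and HMMs; all sequences below are made up.
import numpy as np

BASES = "ACGT"

def build_pwm(sites, pseudocount=0.5, background=0.25):
    counts = np.full((len(sites[0]), 4), pseudocount)
    for site in sites:
        for pos, base in enumerate(site):
            counts[pos, BASES.index(base)] += 1
    freqs = counts / counts.sum(axis=1, keepdims=True)
    return np.log2(freqs / background)          # log-odds scores

def scan(sequence, pwm, threshold=4.0):
    width = pwm.shape[0]
    hits = []
    for start in range(len(sequence) - width + 1):
        window = sequence[start:start + width]
        score = sum(pwm[pos, BASES.index(b)] for pos, b in enumerate(window))
        if score >= threshold:
            hits.append((start, window, round(score, 2)))
    return hits

pwm = build_pwm(["TTGGCA", "TTGGCT", "TTGGCA", "CTGGCA"])   # toy aligned sites
print(scan("ACGTTTGGCATCGATTGGCTAA", pwm))
```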
Abstract:
The application of adaptive antenna techniques to fixed-architecture base stations has been shown to offer wide-ranging benefits, including interference rejection capabilities or increased coverage and spectral efficiency. Unfortunately, the actual implementation of these techniques in mobile communication scenarios has traditionally been set back by two fundamental reasons. On one hand, the lack of flexibility of current transceiver architectures does not allow for the introduction of advanced add-on functionalities. On the other hand, the often oversimplified models for the spatiotemporal characteristics of the radio communications channel generally give rise to performance predictions that are, in practice, too optimistic. The advent of software radio architectures represents a big step toward the introduction of advanced receive/transmit capabilities. Thanks to their inherent flexibility and robustness, software radio architectures are the appropriate enabling technology for the implementation of array processing techniques. Moreover, given the exponential progression of communication standards in coexistence and their constant evolution, software reconfigurability will probably soon become the only cost-efficient alternative for the transceiver upgrade. This article analyzes the requirements for the introduction of software radio techniques and array processing architectures in multistandard scenarios. It basically summarizes the conclusions and results obtained within the ACTS project SUNBEAM, proposing algorithms and analyzing the feasibility of implementation of innovative and software-reconfigurable array processing architectures in multistandard settings.
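As an example of the kind of array-processing building block a software-reconfigurable transceiver could host, here is a minimal narrowband delay-and-sum beamformer for a uniform linear array; the array geometry, angles and noise level are assumptions of the sketch and are unrelated to the SUNBEAM project results.

```python
# Illustrative narrowband delay-and-sum beamformer for a uniform linear array (ULA).
# Array geometry, steering angles and noise level are assumptions of the sketch.
import numpy as np

def steering_vector(n_antennas, angle_deg, spacing=0.5):
    """Plane-wave response of a ULA with element spacing given in wavelengths."""
    k = np.arange(n_antennas)
    return np.exp(-2j * np.pi * spacing * k * np.sin(np.radians(angle_deg)))

n, signal_angle, interferer_angle = 8, 20.0, -40.0
rng = np.random.default_rng(5)
samples = 1000
s = rng.normal(size=samples)                    # desired baseband signal
i = rng.normal(size=samples)                    # interference
x = (np.outer(steering_vector(n, signal_angle), s)
     + np.outer(steering_vector(n, interferer_angle), i)
     + 0.1 * (rng.normal(size=(n, samples)) + 1j * rng.normal(size=(n, samples))))

w = steering_vector(n, signal_angle) / n        # conventional beamformer weights
y = w.conj() @ x                                # array output steered at the user
print("output power toward the user:", np.round(np.mean(np.abs(y) ** 2), 3))
```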