833 results for Robustness
Abstract:
This paper presents a complete solution for creating accurate 3D textured models from monocular video sequences. The methods are developed within the framework of sequential structure from motion, in which a 3D model of the environment is maintained and updated as new visual information becomes available. The camera position is recovered by directly associating the 3D scene model with local image observations. Compared to standard structure from motion techniques, this approach reduces error accumulation while increasing robustness to scene occlusions and feature association failures. The obtained 3D information is used to generate high-quality, composite visual maps of the scene (mosaics). The visual maps are then used to create texture-mapped, realistic views of the scene.
Abstract:
The automatic interpretation of conventional traffic signs is very complex and time consuming. The paper concerns an automatic warning system for driving assistance. It does not interpret the standard traffic signs on the roadside; the proposal is to incorporate into the existing signs another type of traffic sign whose information can be interpreted more easily by a processor. The information to be added is profuse, and therefore the most important objective is the robustness of the system. The basic proposal of this new philosophy is that the co-pilot system for automatic warning and driving assistance can interpret with greater ease the information contained in the new sign, whilst the human driver only has to interpret the "classic" sign. One coding that has been tested with good results, and which seems to us easy to implement, is a rectangular shape with 4 vertical bars of different colours. The size of these signs is equivalent to that of conventional signs (approximately 0.4 m²). The colour information from the sign can be easily interpreted by the proposed processor, and the interpretation is much easier and quicker than that of the pictographs of the classic signs.
Abstract:
This paper focuses on the problem of realizing a plane-to-plane virtual link between a camera attached to the end-effector of a robot and a planar object. In order to make the system independent of the object surface appearance, a structured light emitter is linked to the camera so that 4 laser pointers are projected onto the object. In a previous paper we showed that such a system has good performance and nice characteristics, like partial decoupling near the desired state and robustness against misalignment of the emitter and the camera (J. Pages et al., 2004). However, no analytical results concerning the global asymptotic stability of the system were obtained, due to the high complexity of the visual features utilized. In this work we present a better set of visual features which improves the properties of the features in (J. Pages et al., 2004) and for which it is possible to prove global asymptotic stability.
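For context, the stability question above arises from the classical image-based visual servoing control law sketched below; this is standard eye-in-hand background, not a reproduction of the paper's specific feature set:

```latex
% s: current visual features, s^*: desired features, v: camera velocity,
% L_s: interaction matrix, \widehat{L_s}^{+}: pseudo-inverse of its estimate
\dot{s} = L_s v, \qquad v = -\lambda\, \widehat{L_s}^{+}\,(s - s^*)
% The closed-loop error e = s - s^* evolves as \dot{e} = -\lambda\, L_s \widehat{L_s}^{+}\, e,
% so global asymptotic stability requires L_s \widehat{L_s}^{+} > 0 over the whole
% state space, which is exactly why the choice of visual features matters.
```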
Abstract:
Schistosomiasis mansoni is not just a physical disease, but is related to social and behavioural factors as well. Snails of the Biomphalaria genus are an intermediate host for Schistosoma mansoni and infect humans through water. The objective of this study is to classify the risk of schistosomiasis in the state of Minas Gerais (MG). We focus on socioeconomic and demographic features, basic sanitation features, the presence of accumulated water bodies, dense vegetation in the summer and winter seasons, and related terrain characteristics. We draw on the decision tree approach to infection risk modelling and mapping, and the robustness of the model was verified. The main variables selected by the procedure included the terrain's water accumulation capacity, temperature extremes and the Human Development Index. In addition, the model was used to generate two maps, one showing the risk classification for the entire state of MG and another showing the classification errors. The resulting map was 62.9% accurate.
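As a hedged illustration of the modelling step, the sketch below fits a depth-limited decision tree to placeholder features; the variable names echo those selected in the study, but the data, scales and labels are synthetic assumptions, not the study's data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0.0, 1.0, n),    # terrain water-accumulation capacity (hypothetical scale)
    rng.uniform(10.0, 40.0, n),  # temperature extreme in degrees C (hypothetical)
    rng.uniform(0.4, 0.9, n),    # Human Development Index (hypothetical)
])
y = rng.integers(0, 2, n)        # 0 = low risk, 1 = high risk (placeholder labels)

tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20)
# Cross-validation is one simple way to check the robustness the abstract mentions.
print("mean CV accuracy:", cross_val_score(tree, X, y, cv=5).mean())
```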
Abstract:
The R package “compositions” is a tool for advanced compositional analysis. Its basic functionality has seen some conceptual improvement, now containing facilities to work with and represent ilr bases built from balances, and an elaborated subsystem for dealing with several kinds of irregular data: (rounded or structural) zeroes, incomplete observations and outliers. The general approach to these irregularities is based on subcompositions: for an irregular datum, one can distinguish a “regular” subcomposition (where all parts are actually observed and the datum behaves typically) and a “problematic” subcomposition (with those unobserved, zero or rounded parts, or else where the datum shows an erratic or atypical behaviour). Systematic classification schemes are proposed for both outliers and missing values (including zeros), focusing on the nature of irregularities in the datum subcomposition(s). To compute statistics with values missing at random and structural zeros, a projection approach is implemented: a given datum contributes to the estimation of the desired parameters only on the subcomposition where it was observed. For data sets with values below the detection limit, two different approaches are provided: the well-known imputation technique, and also the projection approach. To compute statistics in the presence of outliers, robust statistics are adapted to the characteristics of compositional data, based on the minimum covariance determinant approach. The outlier classification is based on four different models of outlier occurrence and Monte-Carlo-based tests for their characterization. Furthermore, the package provides special plots helping to understand the nature of outliers in the dataset.
Keywords: coda-dendrogram, lost values, MAR, missing data, MCD estimator, robustness, rounded zeros
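The package itself is written in R; as a hedged illustration of the log-ratio machinery it builds on (not a port of its API), here is a minimal numpy sketch of the clr transform and one standard choice of ilr balances:

```python
import numpy as np

def closure(x):
    """Rescale a composition so its parts sum to 1."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(x):
    """Centred log-ratio: log of each part over the geometric mean."""
    lx = np.log(closure(x))
    return lx - lx.mean()

def ilr_basis(d):
    """An orthonormal (Helmert-type) basis of the clr plane; one choice of balances."""
    v = np.zeros((d, d - 1))
    for j in range(1, d):
        v[:j, j - 1] = 1.0 / j
        v[j, j - 1] = -1.0
        v[:, j - 1] *= np.sqrt(j / (j + 1.0))
    return v

def ilr(x):
    """Isometric log-ratio coordinates with respect to the basis above."""
    return clr(x) @ ilr_basis(len(x))

print(ilr([0.2, 0.3, 0.5]))  # 2 ilr coordinates for a 3-part composition
```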
Abstract:
In this paper, robustness of parametric systems is analyzed using a new approach to interval mathematics called Modal Interval Analysis. Modal Intervals are an extension of classic intervals that recovers some of the properties required of a numerical system. Modal Interval Analysis not only simplifies the computation of interval functions but also allows a semantic interpretation of their results. Necessary, sufficient and, in some cases, necessary and sufficient conditions for robust performance are presented.
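As hedged background on the convention involved (general modal-interval notation, not this paper's specific robustness conditions):

```latex
% A modal interval pairs a classic interval with a quantifier, commonly encoded
% by the order of its endpoints:
X = [a, b],\ a \le b \;\Rightarrow\; \text{proper (existential): } \exists x \in [a, b]
X = [a, b],\ a > b \;\Rightarrow\; \text{improper (universal): } \forall x \in [b, a]
% A computed inclusion f(X_1, \dots, X_n) \subseteq Z is then read as a quantified
% statement over the underlying real intervals, which is the "semantic
% interpretation" of results the abstract refers to.
```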
Abstract:
Colorectal cancer is one of the most prevalent cancers in developed countries. However, the genetic factors influencing its appearance remain far from fully characterized. Recently, a G>A functional transition mapping to the 3' untranslated region of the CXCL12 gene (rs1801157) was found to be under-represented among rectal cancer patients when compared to colon cancer patients from a Swedish series. Here we present the results of an independent analysis of CXCL12 rs1801157 in a larger colorectal cancer (CRC) series of Spanish origin, in order to analyse the robustness of this association within a different European population. No significant difference was observed between controls and colon or rectal cancer patients. We were also unable to find a correlation between rs1801157 and different prognostic markers such as metastasis development or disease-free survival time. The epidemiologic data involving CXCL12 rs1801157 in colorectal cancer risk are discussed.
Abstract:
A condition needed for testing nested hypotheses from a Bayesian viewpoint is that the prior for the alternative model concentrates mass around the small, or null, model. For testing independence in contingency tables, the intrinsic priors satisfy this requirement. Further, the degree of concentration of the priors is controlled by a discrete parameter m, the training sample size, which plays an important role in the resulting answer regardless of the sample size. In this paper we study robustness of the tests of independence in contingency tables with respect to intrinsic priors with different degrees of concentration around the null, and compare with other "robust" results by Good and Crook. Consistency of the intrinsic Bayesian tests is established. We also discuss conditioning issues and sampling schemes, and argue that conditioning should be on either one margin or the table total, but not on both margins. Examples using real and simulated data are given.
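For orientation, the testing problem has the standard form below; this is general background with hedged notation, not the paper's intrinsic-prior construction itself:

```latex
% Independence in an r x c contingency table with cell probabilities p_{ij}:
H_0 : p_{ij} = \alpha_i \beta_j, \qquad \textstyle\sum_i \alpha_i = \sum_j \beta_j = 1,
% tested against the unrestricted multinomial alternative via the Bayes factor
B_{10}(x) = \frac{\int f(x \mid p)\, \pi_1(p)\, dp}
                 {\int f(x \mid \alpha, \beta)\, \pi_0(\alpha, \beta)\, d\alpha\, d\beta},
% where \pi_1 is the intrinsic prior whose concentration around H_0 is governed
% by the training sample size m discussed in the abstract.
```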
Abstract:
Objectives: The study objective was to derive reference pharmacokinetic curves of antiretroviral drugs (ART) based on available population pharmacokinetic (Pop-PK) studies that can be used to optimize therapeutic drug monitoring guided dosage adjustment. Methods: A systematic search of Pop-PK studies of 8 ART in adults was performed in PubMed. To simulate reference PK curves, a summary of the PK parameters was obtained for each drug based on a meta-analysis approach. Most models used a one-compartment model, which was thus chosen as the reference model. Models using bi-exponential disposition were simplified to one compartment, since the first distribution phase was rapid and not determinant for the description of the terminal elimination phase, which is the most relevant for this project. Different absorption models were standardized to first-order absorption processes. Apparent clearance (CL), apparent volume of distribution of the terminal phase (Vz), the absorption rate constant (ka) and inter-individual variability were pooled into summary mean values, weighted by the number of plasma levels; intra-individual variability was weighted by the number of individuals in each study. Simulations based on the summary PK parameters served to construct concentration PK percentiles (NONMEM®). Concordance between individual and summary parameters was assessed graphically using Forest plots. To test robustness, the difference in simulated curves based on published versus summary parameters was calculated using efavirenz as a probe drug. Results: CL was readily accessible from all studies. For studies with one compartment, Vz was the central volume of distribution; for two compartments, Vz was CL/λz. ka was used directly or derived from the mean absorption time (MAT) for more complicated absorption models, assuming MAT = 1/ka. The value of CL for each drug was in excellent agreement throughout all Pop-PK models, suggesting that the minimal concentration derived from the summary models was adequately characterized. The comparison of the concentration vs. time profile for efavirenz between published and summary PK parameters revealed no more than a 20% difference. Although our approach appears adequate for estimation of the elimination phase, the simplification of the absorption phase might lead to a small bias shortly after drug intake. Conclusions: Simulated reference percentile curves based on such an approach represent a useful tool for interpreting drug concentrations. This Pop-PK meta-analysis approach should be further validated and could be extended to elaborate a more sophisticated computerized tool for the Bayesian TDM of ART.
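As a hedged sketch of how such reference percentile curves can be simulated from summary one-compartment parameters (the study itself used NONMEM; every number below is a placeholder, not a pooled estimate from the paper):

```python
import numpy as np

def conc_1cpt_oral(t, dose, cl, vz, ka, f=1.0):
    """One-compartment model with first-order absorption; ke = CL/Vz."""
    ke = cl / vz
    return (f * dose * ka) / (vz * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

rng = np.random.default_rng(1)
n_subj, dose = 1000, 600.0                 # hypothetical 600 mg oral dose
t = np.linspace(0.25, 24.0, 96)            # hours post-dose

# Log-normal inter-individual variability around hypothetical typical values
cl = 10.0 * np.exp(rng.normal(0.0, 0.30, n_subj))    # L/h
vz = 250.0 * np.exp(rng.normal(0.0, 0.25, n_subj))   # L
ka = 0.6 * np.exp(rng.normal(0.0, 0.50, n_subj))     # 1/h; MAT = 1/ka

curves = np.array([conc_1cpt_oral(t, dose, c, v, k) for c, v, k in zip(cl, vz, ka)])
p05, p50, p95 = np.percentile(curves, [5, 50, 95], axis=0)   # reference percentiles
print("median concentration at 12 h:", p50[np.argmin(np.abs(t - 12.0))])
```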
Abstract:
The nervous system is a frequent target of industrial chemicals, pharmaceuticals, and environmental pollutants. To screen large numbers of compounds for their neurotoxic potential, in vitro systems are required which combine organ-specific traits with robustness and high reproducibility. These requirements are met by serum-free aggregating brain cell cultures derived from mechanically dissociated embryonic rat brain. The initial cell suspension, composed of neural stem cells, neural progenitor cells, immature postmitotic neurons, glioblasts, and microglial cells, is kept under continuous gyratory agitation. Spherical aggregates form spontaneously and are maintained in suspension culture for several weeks. Within the aggregates, the cells rearrange and mature, reproducing critical morphogenic events such as migration, proliferation, differentiation, synaptogenesis, and myelination. In addition to the spontaneous reconstitution of histotypic brain architecture, the cultures acquire organ-specific functionality as indicated by activity-dependent glucose consumption, spontaneous electrical activity, and brain-specific inflammatory responses. These three-dimensional primary cell cultures therefore offer a unique model for neurotoxicity testing, both during development and at advanced cellular differentiation. The high number of aggregates available and the excellent reproducibility of the cultures facilitate routine test procedures. This chapter presents a detailed description of the preparation and maintenance of these cultures as well as their use for routine toxicity testing.
Abstract:
This paper is a joint effort between five institutions that introduces several novel similarity measures and combines them to carry out a multimodal segmentation evaluation. The new similarity measures proposed are based on the location and the intensity values of the misclassified voxels, as well as on the connectivity and the boundaries of the segmented data. We show experimentally that the combination of these measures improves the quality of the evaluation. The study shown here has been carried out using four different segmentation methods from four different labs, applied to a simulated MRI dataset of the brain. We claim that our new measures improve the robustness of the evaluation and provide a better understanding of the differences between segmentation methods.
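As a hedged illustration in the same spirit, the sketch below computes a standard overlap score plus a toy measure that weights each misclassified voxel by its distance to the reference region; this is an illustration, not the paper's exact similarity measures.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def dice(seg, ref):
    """Classic overlap measure between two binary masks."""
    inter = np.logical_and(seg, ref).sum()
    return 2.0 * inter / (seg.sum() + ref.sum())

def distance_weighted_error(seg, ref):
    """Mean distance of misclassified voxels to the reference boundary:
    errors far from the boundary are penalised more than boundary jitter."""
    dist = distance_transform_edt(~ref) + distance_transform_edt(ref)
    mis = seg != ref
    return dist[mis].mean() if mis.any() else 0.0

ref = np.zeros((64, 64), bool); ref[16:48, 16:48] = True
seg = np.zeros((64, 64), bool); seg[18:50, 16:48] = True   # shifted by 2 voxels
print("Dice:", dice(seg, ref),
      "distance-weighted error:", distance_weighted_error(seg, ref))
```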
Real-Time implementation of a blind authentication method using self-synchronous speech watermarking
Abstract:
A blind speech watermarking scheme that meets hard real-time deadlines is presented and implemented. One of the key issues in these block-oriented watermarking techniques is to preserve synchronization, that is, to recover the exact position of each block in the mark extraction process. The presented scheme can be split into two distinct parts: the synchronization method and the information mark method. The former is embedded in the time domain and is fast enough to run within real-time requirements. The latter contains the authentication information and is embedded in the wavelet domain. The synchronization and information mark techniques are both tunable in order to allow a configurable method. Thus, capacity, transparency and robustness can be configured depending on the needs. This makes the scheme useful for professional applications, such as telephony authentication or even sending information through radio applications.
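As a hedged toy of the wavelet-domain information mark only (quantization-based embedding is an illustrative assumption here; the time-domain synchronization layer and the paper's actual embedding rule are not reproduced):

```python
import numpy as np
import pywt

STEP, LEVEL, WAVELET, MODE = 0.05, 3, "db4", "periodization"

def embed_bits(signal, bits):
    """Quantize approximation coefficients so their parity encodes the bits."""
    coeffs = pywt.wavedec(signal, WAVELET, level=LEVEL, mode=MODE)
    approx = coeffs[0]
    for i, b in enumerate(bits):
        q = int(np.round(approx[i] / STEP))
        if q % 2 != b:                 # move to the adjacent quantizer cell
            q += 1
        approx[i] = q * STEP
    return pywt.waverec(coeffs, WAVELET, mode=MODE)

def extract_bits(signal, n_bits):
    approx = pywt.wavedec(signal, WAVELET, level=LEVEL, mode=MODE)[0]
    return [int(np.round(approx[i] / STEP)) % 2 for i in range(n_bits)]

block = np.random.default_rng(2).normal(0.0, 0.1, 4096)  # stand-in for one speech block
marked = embed_bits(block, [1, 0, 1, 1])
print(extract_bits(marked, 4))         # -> [1, 0, 1, 1]
```

In schemes of this kind, the quantization step trades robustness against transparency and the number of quantized coefficients sets capacity, which mirrors the tunability the abstract describes.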
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the diagnosis of Alzheimer's Disease (AD). However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. In order to address the small sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. RESULTS Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or Energy-based metrics). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be valid solutions for the presented problem. One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also (in combination with NMSE and PLS) makes this rate variation more stable. A further advance is their generalization ability, since the experiments were performed on two image modalities (SPECT and PET).
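As a hedged sketch of the overall pipeline shape (PLS as supervised dimensionality reduction feeding a kernel SVM; the NMSE feature extraction and the LMNN metric-learning step are omitted, and the data are synthetic stand-ins):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(80, 500))   # stand-in for voxel-wise NMSE features per subject
y = rng.integers(0, 2, 80)       # 0 = control, 1 = AD (placeholder labels)

# PLS scores are a few components chosen to covary with the label, one way to
# address the small-sample-size problem the abstract mentions.
clf = make_pipeline(PLSRegression(n_components=10), SVC(kernel="rbf"))
print("k-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```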
Multigrid preconditioner for nonconforming discretization of elliptic problems with jump coefficients
Abstract:
In this paper, we present a multigrid preconditioner for solving the linear system arising from the piecewise linear nonconforming Crouzeix-Raviart discretization of second order elliptic problems with jump coefficients. The preconditioner uses the standard conforming subspaces as coarse spaces. Numerical tests show both robustness with respect to the jump in the coefficient and near-optimality with respect to the number of degrees of freedom.
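A hedged toy illustration of the two-grid idea with a conforming-interpolation coarse space, in a 1D finite-difference analogue rather than the paper's 2D Crouzeix-Raviart setting:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, cg, spsolve

def fd_matrix(a):
    """Stiffness matrix of -(a u')' on n interior nodes; a holds n+1 cell values."""
    return sp.diags([-a[1:-1], a[:-1] + a[1:], -a[1:-1]], [-1, 0, 1], format="csc")

def prolongation(nc):
    """Linear interpolation from nc coarse to 2*nc + 1 fine interior nodes."""
    P = sp.lil_matrix((2 * nc + 1, nc))
    for j in range(nc):
        i = 2 * j + 1                      # fine index of coarse node j
        P[i, j] = 1.0
        P[i - 1, j] += 0.5
        P[i + 1, j] += 0.5
    return P.tocsc()

def two_grid_prec(A, P, omega=0.6, nu=2):
    """Symmetric V-cycle: Jacobi smoothing around an exact coarse solve."""
    Ac = (P.T @ A @ P).tocsc()
    dinv = 1.0 / A.diagonal()
    def apply(r):
        x = np.zeros_like(r)
        for _ in range(nu):                         # pre-smoothing
            x += omega * dinv * (r - A @ x)
        x += P @ spsolve(Ac, P.T @ (r - A @ x))     # coarse-space correction
        for _ in range(nu):                         # post-smoothing
            x += omega * dinv * (r - A @ x)
        return x
    return LinearOperator(A.shape, matvec=apply)

nc = 63
n = 2 * nc + 1
a = np.where(np.linspace(0.0, 1.0, n + 1) < 0.5, 1.0, 1e4)   # coefficient jump of 1e4
A = fd_matrix(a)
x, info = cg(A, np.ones(n), M=two_grid_prec(A, prolongation(nc)))
print("preconditioned CG converged:", info == 0)
```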
Abstract:
Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interactions of each gene are believed to play a key role in the stability of the structure. With advances in biology, some effort has been made to develop update functions in Boolean models that include recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell-cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function; the results of this validation hint at increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos through the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows the regimes to be discriminated quantitatively. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research. The update function confers additional realism to the model, while reducing the complexity and solution space, thus making it easier to investigate.
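As a hedged sketch of a threshold-based Boolean update of this kind (the signed weights, the tie-breaking rule and the random toy network below are illustrative assumptions, not the yeast or mESC topology or the paper's exact function):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
W = rng.choice([-1, 0, 0, 0, 1], size=(n, n))   # sparse signed interactions: +1 promotes, -1 represses

def step(x, W):
    """x(t+1)_i = 1 if the summed input of active regulators is positive, else 0."""
    return (W @ x > 0).astype(np.int8)

def derrida_point(W, n_pairs=200, flips=1):
    """Average one-step Hamming distance of state pairs differing in `flips` bits;
    plotting this against the initial distance gives the Derrida plot."""
    n = W.shape[0]
    d = 0.0
    for _ in range(n_pairs):
        x = rng.integers(0, 2, n).astype(np.int8)
        y = x.copy()
        idx = rng.choice(n, size=flips, replace=False)
        y[idx] ^= 1
        d += np.mean(step(x, W) != step(y, W))
    return d / n_pairs

# A slope below/above 1 near the origin indicates ordered/chaotic dynamics;
# the critical regime sits at the boundary between the two.
print("normalized one-step divergence:", derrida_point(W))
```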