63 results for robustness
Abstract:
In this work we introduce a new hierarchical surface decomposition method for multiscale analysis of surface meshes. In contrast to other multiresolution methods, our approach relies on spectral properties of the surface to build a binary hierarchical decomposition. Namely, we utilize the first nontrivial eigenfunction of the Laplace-Beltrami operator to recursively decompose the surface. For this reason we coin our surface decomposition the Fiedler tree. Using the Fiedler tree ensures a number of attractive properties, including: mesh-independent decomposition, well-formed and nearly equi-areal surface patches, and noise robustness. We show how the evenly distributed patches can be exploited for generating multiresolution high quality uniform meshes. Additionally, our decomposition permits a natural means for carrying out wavelet methods, resulting in an intuitive method for producing feature-sensitive meshes at multiple scales. Published by Elsevier Ltd.
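As an editorial illustration (not the authors' code), the core splitting step can be sketched as classical spectral bisection: build a graph Laplacian over the mesh vertices (a combinatorial stand-in for the discretized Laplace-Beltrami operator, which in practice would use cotangent weights), take the first nontrivial eigenvector, and split by its sign.

```python
import numpy as np

def fiedler_split(edges, n_vertices):
    # Build dense adjacency and combinatorial Laplacian L = D - A.
    A = np.zeros((n_vertices, n_vertices))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    L = np.diag(A.sum(axis=1)) - A
    vals, vecs = np.linalg.eigh(L)   # eigenpairs in ascending order
    fiedler = vecs[:, 1]             # first nontrivial (Fiedler) eigenvector
    return fiedler >= 0              # sign split -> two child patches

# Recursing on each side of the split yields the binary hierarchy
# the abstract calls the Fiedler tree.
```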
Abstract:
The representation of interfaces by means of the algebraic moving-least-squares (AMLS) technique is addressed. This technique, in which the interface is represented by an unconnected set of points, is interesting for evolving fluid interfaces since there is no surface connectivity. The position of the surface points can thus be updated without concerns about the quality of any surface triangulation. We introduce a novel AMLS technique especially designed for evolving-interface applications that we denote RAMLS (for Robust AMLS). The main advantages with respect to previous AMLS techniques are increased robustness, computational efficiency, and freedom from user-tuned parameters. Further, we propose a new front-tracking method based on the Lagrangian advection of the unconnected point set that defines the RAMLS surface. We assume that a background Eulerian grid is defined with some grid spacing h. The advection of the point set makes the surface evolve in time. The point cloud can be regenerated at any time (in particular, we regenerate it each time step) by intersecting the gridlines with the evolved surface, which guarantees that the density of points on the surface is always well balanced. The intersection algorithm is essentially a ray-tracing algorithm, well studied in computer graphics, in which a line (ray) is traced so as to detect all intersections with a surface. Also, the tracing of each gridline is independent and can thus be performed in parallel. Several tests are reported, assessing first the accuracy of the proposed RAMLS technique and then that of the front-tracking method based on it. Comparison with previous Eulerian, Lagrangian and hybrid techniques encourages further development of the proposed method for fluid mechanics applications. (C) 2008 Elsevier Inc. All rights reserved.
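A minimal sketch of the Lagrangian advection step described above, assuming the velocity field is supplied as a callable (in the paper it would be interpolated from the background Eulerian grid of spacing h); the regeneration-by-ray-tracing step is omitted:

```python
import numpy as np

def advect_points(points, velocity, dt):
    """Midpoint (RK2) Lagrangian advection of an unconnected point set.

    points:   (N, 3) array of surface sample positions
    velocity: callable mapping (N, 3) positions to (N, 3) velocities
    """
    mid = points + 0.5 * dt * velocity(points)
    return points + dt * velocity(mid)

# Toy usage: rigid rotation about the z-axis.
vel = lambda p: np.stack([-p[:, 1], p[:, 0], np.zeros(len(p))], axis=1)
pts = np.random.rand(100, 3)
pts = advect_points(pts, vel, dt=0.01)
```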
Abstract:
In this Letter we deal with a nonlinear Schrödinger equation with chaotic, random, and nonperiodic cubic nonlinearity. Our goal is to study the soliton evolution with the strength of the nonlinearity perturbed in the space and time coordinates, and to check its robustness under these conditions. Here we show that the chaotic perturbation is more effective in destroying the soliton behavior than random or nonperiodic perturbation. For a real system, the perturbation can be related to, e.g., impurities in crystalline structures, or coupling to a thermal reservoir which, on average, enhances the nonlinearity. We also discuss the relevance of such random perturbations to the dynamics of Bose-Einstein condensates and their collective excitations and transport. (C) 2010 Elsevier B.V. All rights reserved.
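The abstract does not name a numerical scheme; a common choice for this kind of cubic nonlinear Schrödinger equation is the split-step Fourier method. The sketch below integrates i*psi_t + 0.5*psi_xx + g(x,t)*|psi|^2*psi = 0 for a sech soliton, with a hypothetical space-time modulation g(x,t) standing in for the chaotic/random/nonperiodic perturbations studied in the paper:

```python
import numpy as np

N, Lbox = 1024, 80.0
x = np.linspace(-Lbox / 2, Lbox / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=Lbox / N)   # angular wavenumbers
dt, steps = 0.005, 2000

def g(x, t):
    # Illustrative perturbed nonlinearity strength; the paper compares
    # chaotic, random, and nonperiodic choices of this modulation.
    return 1.0 + 0.1 * np.sin(np.sqrt(2) * x) * np.cos(np.sqrt(3) * t)

psi = 1.0 / np.cosh(x)                   # bright-soliton initial condition
half_kick = np.exp(-0.25j * k**2 * dt)   # half linear step in Fourier space

for n in range(steps):                   # Strang splitting
    psi = np.fft.ifft(half_kick * np.fft.fft(psi))
    psi *= np.exp(1j * g(x, n * dt) * np.abs(psi) ** 2 * dt)  # nonlinear kick
    psi = np.fft.ifft(half_kick * np.fft.fft(psi))
```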
Abstract:
The 5-HT(1A) receptor plays an important role in the delayed onset of antidepressant action of a class of selective serotonin reuptake inhibitors. Moreover, 5-HT(1A) receptor levels have been shown to be altered in patients suffering from major depression. In this work, hologram quantitative structure-activity relationship (HQSAR) studies were performed on a series of arylpiperazine compounds with affinity for the 5-HT(1A) receptor. The models were constructed with a training set of 70 compounds. The most significant HQSAR model (q(2) = 0.81, r(2) = 0.96) was generated using atoms, bonds, connections, chirality, and donor and acceptor as fragment distinctions, with a fragment size of 6-9. Predictions for an external test set containing 20 compounds are in good agreement with experimental results, showing the robustness of the model. Additionally, useful information can be obtained from the 2D contribution maps.
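HQSAR works by hashing molecular fragments into fixed-length count vectors ("holograms") and regressing activity on them by partial least squares. The sketch below, with entirely synthetic fragments and affinities, shows the general shape of such a pipeline and how a leave-one-out q(2) and fitted r(2) are computed; it is not the commercial HQSAR implementation:

```python
import numpy as np
from zlib import crc32
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def hologram(fragments, length=353):
    """Hash fragment strings into a fixed-length count vector (a 'hologram').
    353 is one of the customary prime hologram lengths; purely illustrative."""
    h = np.zeros(length)
    for frag in fragments:
        h[crc32(frag.encode()) % length] += 1
    return h

# Placeholder data: synthetic fragment sets and affinities for 70 "compounds".
rng = np.random.default_rng(0)
vocab = ['C-C', 'C-N', 'C=O', 'c1ccccc1', 'N-H', 'O-H', 'C-F']
frags = [list(rng.choice(vocab, size=rng.integers(3, 7))) for _ in range(70)]
y = rng.normal(7.0, 1.0, 70)                      # e.g. pKi values

X = np.array([hologram(f) for f in frags])
pls = PLSRegression(n_components=2)
y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
q2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
r2 = pls.fit(X, y).score(X, y)
print(f"q2 = {q2:.2f}, r2 = {r2:.2f}")
```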
Abstract:
Several protease inhibitors have reached the world market in the last fifteen years, dramatically improving the quality of life and life expectancy of millions of HIV-infected patients. In spite of the tremendous research efforts in this area, resistant HIV-1 variants are constantly decreasing the ability of the drugs to efficiently inhibit the enzyme. As a consequence, inhibitors with novel frameworks are necessary to circumvent resistance to chemotherapy. In the present work, we have created 3D QSAR models for a series of 82 HIV-1 protease inhibitors employing the comparative molecular field analysis (CoMFA) method. Significant correlation coefficients were obtained (q(2) = 0.82 and r(2) = 0.97), indicating the internal consistency of the best model, which was then used to evaluate an external test set containing 17 compounds. The predicted values were in good agreement with the experimental results, showing the robustness of the model and its substantial predictive power for untested compounds. The final QSAR model and the information gathered from the CoMFA contour maps should be useful for the design of novel anti-HIV agents with improved potency.
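For context, the external-validation statistic behind statements like "predicted values in good agreement with the experimental results" is commonly a predictive r(2) referenced to the training-set mean; a minimal sketch of that convention:

```python
import numpy as np

def r2_pred(y_train, y_test, y_test_hat):
    """Predictive r^2 for an external test set, referenced to the
    training-set mean (the usual QSAR external-validation convention)."""
    press = np.sum((y_test - y_test_hat) ** 2)
    ss = np.sum((y_test - y_train.mean()) ** 2)
    return 1.0 - press / ss
```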
Abstract:
The relationship between the structure and function of biological networks constitutes a fundamental issue in systems biology. In particular, the structure of protein-protein interaction networks is related to important biological functions. In this work, we investigated how the resilience of such networks is determined by their large-scale features. Four species are taken into account, namely the yeast Saccharomyces cerevisiae, the worm Caenorhabditis elegans, the fly Drosophila melanogaster and Homo sapiens. We adopted two entropy-related measurements (degree entropy and dynamic entropy) in order to quantify the overall degree of robustness of these networks. We verified that while they exhibit similar structural variations under random node removal, they differ significantly when subjected to intentional attacks (hub removal). As a matter of fact, more complex species tended to exhibit more robust networks. More specifically, we quantified how six important measurements of network topology (namely the clustering coefficient, average degree of neighbors, average shortest path length, diameter, assortativity coefficient, and slope of the power-law degree distribution) correlate with the two entropy measurements. Our results revealed that the fraction of hubs and the average neighbor degree contribute significantly to the resilience of networks. In addition, the topological analysis of the removed hubs indicated that the presence of alternative paths between the proteins connected to hubs tends to reinforce resilience. The performed analysis helps in understanding how resilience arises in networks and can be applied to the development of protein network models.
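A small sketch of one of the two measurements, degree entropy, together with the random-removal versus hub-removal experiment, run on a scale-free surrogate network (networkx assumed; the dynamic entropy and the real PPI data sets are not reproduced here):

```python
import numpy as np
import networkx as nx

def degree_entropy(G):
    """Shannon entropy of the degree distribution, H = -sum_k p(k) log p(k)."""
    degrees = np.array([d for _, d in G.degree()])
    _, counts = np.unique(degrees, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def attack(G, fraction=0.05, intentional=True):
    """Remove a fraction of nodes: highest-degree hubs or a random sample."""
    G = G.copy()
    n_remove = int(fraction * G.number_of_nodes())
    if intentional:
        hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:n_remove]
        victims = [v for v, _ in hubs]
    else:
        rng = np.random.default_rng(0)
        victims = list(rng.choice(list(G), n_remove, replace=False))
    G.remove_nodes_from(victims)
    return G

G = nx.barabasi_albert_graph(1000, 3)   # scale-free stand-in for a PPI network
print(degree_entropy(G),
      degree_entropy(attack(G, intentional=False)),
      degree_entropy(attack(G, intentional=True)))
```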
Abstract:
This paper proposes a method to locate and track people by combining evidence from multiple cameras using the homography constraint. The proposed method uses foreground pixels from simple background subtraction to compute evidence of the location of people on a reference ground plane. The algorithm computes the amount of support that basically corresponds to the "foreground mass" above each pixel. Therefore, pixels that correspond to ground points have more support. The support is normalized to compensate for perspective effects and accumulated on the reference plane for all camera views. The detection of people on the reference plane then becomes a search for regions of local maxima in the accumulator. Many false positives are filtered by checking the visibility consistency of the detected candidates against all camera views. The remaining candidates are tracked using Kalman filters and appearance models. Experimental results using challenging data from PETS'06 show good performance of the method in the presence of severe occlusion. Ground truth data also confirms the robustness of the method. (C) 2010 Elsevier B.V. All rights reserved.
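A compact sketch of the accumulation idea, assuming OpenCV and precomputed image-to-ground-plane homographies; it omits the paper's per-pixel perspective normalization of the foreground mass:

```python
import numpy as np
import cv2

def ground_plane_accumulator(masks, homographies, plane_size):
    """Accumulate foreground support from all views on the reference plane.

    masks:        list of binary foreground masks (one per camera)
    homographies: list of 3x3 image-to-ground-plane homographies
    plane_size:   (width, height) of the reference-plane accumulator
    """
    acc = np.zeros((plane_size[1], plane_size[0]), np.float32)
    for mask, H in zip(masks, homographies):
        acc += cv2.warpPerspective(mask.astype(np.float32), H, plane_size)
    return acc

def local_maxima(acc, min_support):
    """People candidates = local maxima of the accumulator above a threshold."""
    dil = cv2.dilate(acc, np.ones((15, 15), np.uint8))
    return np.argwhere((acc == dil) & (acc >= min_support))
```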
Abstract:
This work describes a novel methodology for automatic contour extraction from 2D images of 3D neurons (e.g. camera lucida images and other types of 2D microscopy). Most contour-based shape analysis methods cannot be used to characterize such cells because of overlaps between neuronal processes. The proposed framework is specifically aimed at the problem of contour following even in the presence of multiple overlaps. First, the input image is preprocessed in order to obtain an 8-connected skeleton with one-pixel-wide branches, as well as a set of critical regions (i.e., bifurcations and crossings). Next, for each subtree, the tracking stage iteratively labels all valid pixels of branches up to a critical region, where it determines the suitable direction to proceed. Finally, the labeled skeleton segments are followed in order to yield the parametric contour of the neuronal shape under analysis. The reported system was successfully tested on several images, and the results from a set of three neuron images are presented here, each pertaining to a different class (i.e. alpha, delta and epsilon ganglion cells) and containing a total of 34 crossings. The algorithm successfully traversed all these overlaps. The method has also been found to exhibit robustness even for images with close parallel segments. The proposed method is robust and may be implemented in an efficient manner. The introduction of this approach should pave the way for more systematic application of contour-based shape analysis methods in neuronal morphology. (C) 2008 Elsevier B.V. All rights reserved.
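The preprocessing stage can be approximated with off-the-shelf tools; a sketch assuming scikit-image, with a generic 3x3 neighbour count standing in for the paper's exact critical-region detection:

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def skeleton_and_critical_regions(binary_image):
    """One-pixel-wide skeleton plus candidate critical regions
    (skeleton pixels with 3+ neighbours: bifurcations or crossings)."""
    skel = skeletonize(binary_image.astype(bool))
    kernel = np.ones((3, 3), int)
    kernel[1, 1] = 0                       # count neighbours, not the pixel
    neighbours = convolve(skel.astype(int), kernel, mode='constant')
    critical = skel & (neighbours >= 3)
    return skel, critical
```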
Abstract:
In this paper we extend partial linear models with normal errors to Student-t errors. Penalized likelihood equations are applied to derive the maximum likelihood estimates, which appear to be robust against outlying observations in the sense of the Mahalanobis distance. In order to study the sensitivity of the penalized estimates under some usual perturbation schemes in the model or data, the local influence curvatures are derived and some diagnostic graphics are proposed. A motivating example, preliminarily analyzed under normal errors, is reanalyzed under Student-t errors. The local influence approach is used to compare the sensitivity of the model estimates. (C) 2010 Elsevier B.V. All rights reserved.
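The down-weighting mechanism behind this robustness is easiest to see in the plain linear-model case: in the EM iteration for Student-t errors, each observation receives weight (nu + 1)/(nu + r_i^2/sigma^2), so large Mahalanobis-type residuals are discounted. A sketch of that parametric part only (the penalized nonparametric component of the partial linear model is omitted):

```python
import numpy as np

def t_regression_em(X, y, nu=4.0, n_iter=50):
    """EM fit of a linear model with Student-t errors (known nu).

    Observations with large squared standardized residuals get small
    weights w_i = (nu + 1) / (nu + r_i^2 / sigma^2), which is the
    source of the robustness discussed in the abstract.
    """
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # OLS start
    sigma2 = np.mean((y - X @ beta) ** 2)
    for _ in range(n_iter):
        r2 = (y - X @ beta) ** 2
        w = (nu + 1.0) / (nu + r2 / sigma2)            # E-step weights
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # weighted LS
        sigma2 = np.sum(w * (y - X @ beta) ** 2) / n
    return beta, sigma2
```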
Abstract:
The Birnbaum-Saunders (BS) model is a positively skewed statistical distribution that has received great attention in recent decades. A generalized version of this model was derived based on symmetrical distributions in the real line, named the generalized BS (GBS) distribution. The R package named gbs was developed to analyze data from GBS models. This package contains probabilistic and reliability indicators and random number generators from GBS distributions. Parameter estimates for censored and uncensored data can also be obtained by means of likelihood methods from the gbs package. Goodness-of-fit and diagnostic methods were also implemented in this package in order to check the suitability of the GBS models. In this article, the capabilities and features of the gbs package are illustrated by using simulated and real data sets. Shape and reliability analyses for GBS models are presented. A simulation study for evaluating the quality and sensitivity of the estimation method developed in the package is provided and discussed. (C) 2008 Elsevier B.V. All rights reserved.
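Since this listing carries no R code, here is a language-neutral sketch of the classical BS sampler and density that a package like gbs builds on (the GBS case replaces the standard normal kernel by another symmetric density):

```python
import numpy as np

def rbs(n, alpha, beta, rng=None):
    """Draw Birnbaum-Saunders(alpha, beta) variates via the normal
    representation T = beta*(a*Z/2 + sqrt((a*Z/2)^2 + 1))^2, Z ~ N(0,1)."""
    rng = rng or np.random.default_rng()
    u = alpha * rng.standard_normal(n) / 2.0
    return beta * (u + np.sqrt(u ** 2 + 1.0)) ** 2

def dbs(t, alpha, beta):
    """BS density f(t) = phi(a(t)) * a'(t), with
    a(t) = (sqrt(t/beta) - sqrt(beta/t)) / alpha."""
    a = (np.sqrt(t / beta) - np.sqrt(beta / t)) / alpha
    da = (t + beta) / (2.0 * alpha * np.sqrt(beta) * t ** 1.5)
    return np.exp(-0.5 * a ** 2) / np.sqrt(2 * np.pi) * da
```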
Abstract:
The Grubbs measurement model is frequently used to compare several measuring devices. It is common to assume that the random terms have a normal distribution. However, such an assumption makes the inference vulnerable to outlying observations, whereas scale mixtures of normal distributions have been an interesting alternative for producing robust estimates, keeping the elegance and simplicity of maximum likelihood theory. The aim of this paper is to develop an EM-type algorithm for the parameter estimation, and to use the local influence method to assess the robustness aspects of these parameter estimates under some usual perturbation schemes. In order to identify outliers and to criticize the model building, we use the local influence procedure in a study to compare the precision of several thermocouples. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
We introduce in this paper the class of linear models with first-order autoregressive elliptical errors. The score functions and the Fisher information matrices are derived for the parameters of interest, and an iterative process is proposed for the parameter estimation. Some robustness aspects of the maximum likelihood estimates are discussed. The normal curvatures of local influence are also derived for some usual perturbation schemes, and diagnostic graphics to assess the sensitivity of the maximum likelihood estimates are proposed. The methodology is applied to analyse the daily log excess returns on Microsoft stock, whose empirical distribution appears to have AR(1) and heavy-tailed errors. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The aim of this article is to discuss the estimation of the systematic risk in capital asset pricing models with heavy-tailed error distributions to explain the asset returns. Diagnostic methods for assessing departures from the model assumptions, as well as the influence of observations on the parameter estimates, are also presented. It may be shown that outlying observations are down-weighted in the maximum likelihood equations of linear models with heavy-tailed error distributions, such as Student-t, power exponential, logistic II, and so on. This robustness aspect may also be extended to influential observations. An application in which the systematic risk estimate of Microsoft is compared under normal and heavy-tailed errors is presented for illustration.
Abstract:
The generalized Birnbaum-Saunders (GBS) distribution is a new class of positively skewed models with lighter and heavier tails than the traditional Birnbaum-Saunders (BS) distribution, which is widely applied to study lifetimes. However, the theoretical arguments and the interesting properties of the GBS model have made its application possible beyond lifetime analysis. The aim of this paper is to present the GBS distribution as a useful model for describing pollution data and to derive its positive and negative moments. Based on these moments, we develop estimation and goodness-of-fit methods. Also, some properties of the proposed estimators useful for developing asymptotic inference are presented. Finally, an application with real data from the environmental sciences is given to illustrate the methodology developed. This example shows that the empirical fit of the GBS distribution to the data is very good. Thus, the GBS model is appropriate for describing air pollutant concentration data, and it produces better results than the lognormal model when an administrative target for abating air pollution is set. Copyright (c) 2007 John Wiley & Sons, Ltd.
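For the classical BS special case, moment-type estimation from E[T] = beta*(1 + alpha^2/2) and E[1/T] = (1 + alpha^2/2)/beta reduces to two sample means; a sketch (the paper's GBS moment machinery is more general):

```python
import numpy as np

def bs_moment_estimates(t):
    """Moment-type estimators for the classical BS model.
    s is the sample mean and r the sample harmonic mean; then
    s*r = beta^2 and s/r = (1 + alpha^2/2)^2."""
    s = np.mean(t)
    r = 1.0 / np.mean(1.0 / t)
    beta_hat = np.sqrt(s * r)
    alpha_hat = np.sqrt(2.0 * (np.sqrt(s / r) - 1.0))
    return alpha_hat, beta_hat
```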
Abstract:
The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s, which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time, when the interfacial force is linear. However, this linear system is large and dense, and thus it is challenging to streamline its solution. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, for which we obtain a rigorous estimate. This matrix is expeditiously computed by using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes. We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve. (C) 2009 Elsevier Inc. All rights reserved.
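A 1D toy illustrating the lagged spreading/interpolation operators, using Peskin's standard 4-point regularized delta kernel; this is an editorial sketch, not the paper's matrix-approximation machinery:

```python
import numpy as np

def phi4(r):
    """Peskin's standard 4-point regularized delta kernel."""
    r = np.abs(r)
    out = np.zeros_like(r)
    m1 = r <= 1.0
    m2 = (r > 1.0) & (r < 2.0)
    out[m1] = (3 - 2*r[m1] + np.sqrt(1 + 4*r[m1] - 4*r[m1]**2)) / 8
    out[m2] = (5 - 2*r[m2] - np.sqrt(-7 + 12*r[m2] - 4*r[m2]**2)) / 8
    return out

def spreading_matrix(X, grid, h):
    """S[i, k] = delta_h(x_i - X_k): spreads Lagrangian forces to the grid.
    Interpolation back to the interface is h * S.T (the discrete adjoint).
    In the lagged-operator scheme, S is frozen at the current time level."""
    return phi4((grid[:, None] - X[None, :]) / h) / h

# 1D toy: a periodic-style grid of spacing h and a few immersed points.
h = 1.0 / 64
grid = np.arange(0.0, 1.0, h)
X = np.array([0.30, 0.31, 0.52])         # immersed boundary points
S = spreading_matrix(X, grid, h)
F = np.array([1.0, -0.5, 2.0])           # Lagrangian point forces
f_grid = S @ F                           # force density on the Eulerian grid
U_interface = h * (S.T @ f_grid)         # interpolate a grid field back
```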