14 results for Multiple methods framework
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
The integration of nanostructured films containing biomolecules with silicon-based technologies is a promising route toward miniaturized biosensors of high sensitivity and selectivity. A challenge, however, is to avoid cross talk among sensing units in an array with multiple sensors located on a small area. In this letter, we describe an array of 16 sensing units of a light-addressable potentiometric sensor (LAPS) made with layer-by-layer (LbL) films of a poly(amidoamine) dendrimer (PAMAM) and single-walled carbon nanotubes (SWNTs), coated with a layer of the enzyme penicillinase. A visual inspection of the data from constant-current measurements with liquid samples containing distinct concentrations of penicillin, glucose, or a buffer indicated possible cross talk between units that contained penicillinase and those that did not. With the use of multidimensional data projection techniques, normally employed in information visualization methods, we managed to distinguish the results from the modified LAPS, even in cases where the units were adjacent to each other. Furthermore, the plots generated with the interactive document map (IDMAP) projection technique enabled the distinction of different concentrations of penicillin, from 5 mmol L(-1) down to 0.5 mmol L(-1). Data visualization also confirmed the enhanced performance of the sensing units containing carbon nanotubes, consistent with the analysis of results for LAPS sensors. The use of visual analytics, as with projection methods, may be essential for handling the large amount of data generated in multiple sensor arrays to achieve high performance in miniaturized systems.
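IDMAP is a specialized projection from the information visualization literature and is not available in common libraries; as a hedged stand-in, the sketch below projects hypothetical 16-unit sensor responses to 2D with classical MDS from scikit-learn, illustrating how such a projection can separate sample types that visual inspection alone may not.

```python
# Minimal sketch: projecting multidimensional sensor-array responses to 2D.
# The paper uses IDMAP; classical MDS is used here only as a stand-in.
# All sensor readings below are hypothetical.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
# Hypothetical constant-current responses of 16 sensing units,
# 10 repeated measurements per sample type.
samples = {
    "penicillin 5 mM": rng.normal(1.0, 0.05, (10, 16)),
    "penicillin 0.5 mM": rng.normal(0.6, 0.05, (10, 16)),
    "buffer": rng.normal(0.2, 0.05, (10, 16)),
}
X = np.vstack(list(samples.values()))
proj = MDS(n_components=2, random_state=0).fit_transform(X)

# Report the 2D centroid per sample type; well-separated centroids mirror
# the kind of separation the paper obtains with IDMAP.
for row, name in enumerate(samples):
    block = proj[10 * row: 10 * (row + 1)]
    print(name, "->", block.mean(axis=0).round(2))
```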
Abstract:
In this article, we are interested in evaluating different parameter-estimation strategies for a multiple linear regression model. To estimate the model parameters, we used data from a clinical trial whose aim was to verify whether the mechanical test of the maximum-force property (EM-FM) is associated with femoral mass, femoral diameter, and the experimental group of ovariectomized rats of the species Rattus norvegicus albinus, Wistar variety. Three methodologies are compared for estimating the model parameters: the classical methodology, based on the least squares method; the Bayesian methodology, based on Bayes' theorem; and the Bootstrap method, based on resampling procedures.
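A minimal sketch of two of the three estimation strategies compared (least squares and case-resampling bootstrap), using simulated data since the clinical-trial data are not reproduced here; the Bayesian fit would additionally require a prior and an MCMC or conjugate analysis.

```python
# Minimal sketch, on simulated data: classical least squares vs. bootstrap
# estimation of a multiple linear regression. Covariates are stand-ins for
# the clinical variables named in the abstract.
import numpy as np

rng = np.random.default_rng(1)
n = 60
X = np.column_stack([np.ones(n),              # intercept
                     rng.normal(30, 5, n),    # stand-in for femoral mass
                     rng.normal(4, 0.5, n)])  # stand-in for femoral diameter
y = X @ np.array([10.0, 2.0, -1.5]) + rng.normal(0, 3, n)

# Classical strategy: ordinary least squares.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Bootstrap strategy: refit on case-resampled data and summarize.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)               # resample cases with replacement
    b, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    boot.append(b)
boot = np.asarray(boot)

print("OLS estimate:   ", beta_ols.round(2))
print("bootstrap mean: ", boot.mean(axis=0).round(2))
print("bootstrap SE:   ", boot.std(axis=0, ddof=1).round(2))
```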
Abstract:
The advent of the Auger Engineering Radio Array (AERA) necessitates the development of a powerful framework for the analysis of radio measurements of cosmic ray air showers. As AERA performs "radio-hybrid" measurements of air shower radio emission in coincidence with the surface particle detectors and fluorescence telescopes of the Pierre Auger Observatory, the radio analysis functionality had to be incorporated into the existing hybrid analysis solutions for fluorescence and surface detector data. This goal has been achieved in a natural way by extending the existing Auger Offline software framework with radio functionality. In this article, we lay out the design, highlights, and features of the radio extension implemented in the Auger Offline framework. Its functionality has achieved a high degree of sophistication and offers advanced features such as vectorial reconstruction of the electric field, advanced signal processing algorithms, transparent and efficient handling of FFTs, a very detailed simulation of detector effects, and the read-in of multiple data formats, including data from various radio simulation codes. The source code of this radio functionality can be made available to interested parties on request. (C) 2011 Elsevier B.V. All rights reserved.
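The Offline radio extension itself is C++ and available on request; as a generic illustration of the kind of FFT-based trace processing such a framework wraps, here is a hedged numpy sketch that band-passes a sampled voltage trace (the sampling rate and the 30-80 MHz band are assumptions for the example, not values taken from the paper).

```python
# Minimal sketch of FFT-based band-pass filtering of a sampled voltage
# trace, the kind of signal processing the radio extension automates.
# Sampling rate, band edges, and the noise-only trace are illustrative.
import numpy as np

fs = 200e6                          # sampling rate [Hz], assumed
n = 2048
trace = np.random.default_rng(2).normal(0, 1, n)   # stand-in trace

spec = np.fft.rfft(trace)
freq = np.fft.rfftfreq(n, d=1 / fs)
spec[(freq < 30e6) | (freq > 80e6)] = 0.0          # keep 30-80 MHz only
filtered = np.fft.irfft(spec, n=n)

print("peak |V| after filtering:", float(np.abs(filtered).max()))
```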
Abstract:
Objectives. To study mortality trends related to Chagas disease, taking into account all mentions of this cause listed on any line or part of the death certificate. Methods. Mortality data for 1985-2006 were obtained from the multiple cause-of-death database maintained by the Sao Paulo State Data Analysis System (SEADE). Chagas disease was classified as the underlying cause-of-death or as an associated (non-underlying) cause-of-death. The total number of times Chagas disease was mentioned on the death certificates was also considered. Results. During this 22-year period, there were 40 002 deaths related to Chagas disease: 34 917 (87.29%) classified as the underlying cause-of-death and 5 085 (12.71%) as an associated cause-of-death. The results show a 56.07% decline in the death rate due to Chagas disease as the underlying cause and a stable rate as an associated cause. The number of deaths was 44.5% higher among men. The fact that 83.5% of the deaths occurred after 45 years of age reflects a cohort effect. The main causes associated with Chagas disease as the underlying cause-of-death were direct complications of cardiac involvement, such as conduction disorders, arrhythmias, and heart failure. Ischemic heart disease, cerebrovascular disorders, and neoplasms were the main underlying causes when Chagas disease was an associated cause-of-death. Conclusions. For total mentions of Chagas disease, a 51.34% decline in the death rate was observed, whereas the decline in the number of deaths was only 5.91%, the decline being smaller among women and accompanied by a shift of deaths to older age brackets. Using the multiple cause-of-death method contributed to the understanding of the natural history of Chagas disease.
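A hedged sketch of the multiple cause-of-death tabulation described above, with hypothetical records and illustrative field names (the actual SEADE database layout is not reproduced here).

```python
# Minimal sketch of multiple cause-of-death tabulation: count Chagas disease
# as underlying cause, as associated cause, and as total mentions. Records
# and field names are hypothetical; "B57" is the ICD-10 block for Chagas
# disease, used here for illustration.
CHAGAS = "B57"

death_certificates = [
    {"underlying": "B57", "associated": ["I49", "I50"]},  # Chagas underlying
    {"underlying": "I21", "associated": ["B57"]},         # Chagas associated
    {"underlying": "C34", "associated": ["J18"]},         # no mention
]

underlying = sum(1 for d in death_certificates
                 if d["underlying"].startswith(CHAGAS))
associated = sum(1 for d in death_certificates
                 if any(c.startswith(CHAGAS) for c in d["associated"]))
print("underlying:", underlying,
      "associated:", associated,
      "total mentions:", underlying + associated)
```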
Abstract:
The ejection of gas out of the disc in late-type galaxies is related to star formation and is mainly due to the explosion of Type II supernovae (SNe II). In a previous paper, we considered the evolution of a single Galactic fountain, that is, a fountain powered by a single SN cluster. Using three-dimensional hydrodynamical simulations, we studied in detail the fountain flow and its dependence on several factors, such as Galactic rotation, the distance to the Galactic centre, and the presence of a hot gaseous halo. As a natural follow-up, this paper investigates the dynamical evolution of multiple generations of fountains generated by ~100 OB associations. We have considered the observed size-frequency distribution of young stellar clusters within the Galaxy in order to appropriately fuel the multiple fountains in our simulations. Most of the results of the previous paper have been confirmed, such as the formation of intermediate-velocity clouds above the disc by the multiple fountains. This work also confirms the localized nature of the fountain flows: the freshly ejected metals tend to fall back close to the same Galactocentric region where they are delivered. Therefore, the fountains do not significantly change the radial profile of the disc chemical abundance. The multiple-fountain simulations also allowed us to consistently calculate the feedback of star formation on the halo gas. We found that the hot gas gains about 10 per cent of all the SN II energy produced in the disc. Thus, the SN feedback more than compensates for the halo radiative losses and allows a quasi-steady-state disc-halo circulation to exist. Finally, we have also considered the possibility of mass infall from the intergalactic medium and its interaction with the clouds formed by the fountains. Though our simulations are not suitable for reproducing the slow rotational pattern typically observed in the haloes around disc galaxies, they indicate that the presence of an external gas infall may help to slow down the rotation of the gas in the clouds and thus reduce the amount of angular momentum that they transfer to the coronal gas, as previously suggested in the literature.
Abstract:
Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. As predictive performance evaluation is a multidimensional problem, single scalar summaries such as the error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. For this reason, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing them to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods within the same framework, we hope this paper may shed some light on which methods are more suitable to use in different situations.
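As one concrete instance of the kind of graphical method surveyed, the sketch below computes an ROC curve on synthetic scores (the choice of ROC here is ours for illustration; the survey covers several methods).

```python
# Minimal sketch: an ROC curve, one example of a graphical performance-
# evaluation method. Labels and scores are synthetic.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(3)
y_true = rng.integers(0, 2, 200)
y_score = y_true + rng.normal(0, 0.8, y_true.size)  # informative but noisy

fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(f"AUC = {auc(fpr, tpr):.3f}")
# Each (fpr, tpr) pair is one operating point; plotting them all depicts the
# full trade-off instead of collapsing it to a single scalar summary.
```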
Abstract:
In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the exceedances occur according to a non-homogeneous Poisson process (NHPP). This Poisson process has rate function lambda(t), t >= 0, which depends on some parameters that must be estimated. We take into account two types of rate function: the Weibull and the Goel-Okumoto. We consider models with and without change-points; when change-points are assumed, there may be one, two, or three of them, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. The results are applied to ozone data provided by the Mexico City monitoring network. We first assume that no change-points are present and then, depending on the fit of the model, allow the presence of one, two, or three change-points. Copyright (C) 2009 John Wiley & Sons, Ltd.
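Under one common Weibull parameterization of the NHPP rate, lambda(t) = (beta/sigma)(t/sigma)^(beta-1), the mean function is m(t) = (t/sigma)^beta, and the probability of k exceedances in (t1, t2] follows directly from the NHPP definition. A minimal numerical sketch with illustrative parameter values (the paper estimates these by Gibbs sampling, which is not reproduced here):

```python
# Minimal sketch: P(N(t1, t2] = k) for an NHPP with Weibull rate
# lambda(t) = (beta/sigma) * (t/sigma)**(beta - 1), so that the mean
# function is m(t) = (t/sigma)**beta. Parameter values are illustrative.
from math import exp, factorial

def weibull_mean(t, beta, sigma):
    """Expected number of exceedances in (0, t]."""
    return (t / sigma) ** beta

def prob_k_exceedances(k, t1, t2, beta, sigma):
    """Poisson probability with mean m(t2) - m(t1)."""
    mu = weibull_mean(t2, beta, sigma) - weibull_mean(t1, beta, sigma)
    return exp(-mu) * mu ** k / factorial(k)

# Example: chance of exactly 2 exceedances in days 30-60 (assumed values).
print(round(prob_k_exceedances(2, t1=30, t2=60, beta=1.3, sigma=25), 4))
```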
Abstract:
In many data sets from clinical studies there are patients insusceptible to the occurrence of the event of interest. Survival models which ignore this fact are generally inadequate. The main goal of this paper is to describe an application of the generalized additive models for location, scale, and shape (GAMLSS) framework to the fitting of long-term survival models. In this work, the number of competing causes of the event of interest follows the negative binomial distribution, so that some well-known models found in the literature are characterized as particular cases of our proposal. The model is conveniently parameterized in terms of the cured fraction, which is then linked to covariates. We explore the use of the gamlss package in R as a powerful tool for inference in long-term survival models. The procedure is illustrated with a numerical example. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
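For reference, one common negative binomial cure-rate parameterization from the literature (our hedged summary, not equations quoted from the paper): with N competing causes negative binomial with mean theta and dispersion eta, and F(t) the cause-specific lifetime distribution,

```latex
% Hedged summary of a common negative binomial cure-rate parameterization:
% N ~ NB with mean \theta and dispersion \eta, F(t) the lifetime c.d.f.
S_{\mathrm{pop}}(t) \;=\; \mathbb{E}\!\left[S(t)^{N}\right]
  \;=\; \left[\,1 + \eta\,\theta\,F(t)\,\right]^{-1/\eta},
\qquad
p_{0} \;=\; \lim_{t\to\infty} S_{\mathrm{pop}}(t)
  \;=\; \left(1 + \eta\,\theta\right)^{-1/\eta}.
```

so the cured fraction p0 is the long-time limit of the population survival and can be linked to covariates, as described in the abstract.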
Abstract:
This paper deals with the classical one-dimensional integer cutting stock problem, which consists of cutting a set of available stock lengths in order to produce smaller ordered items, so as to optimize a given objective function (e.g., minimizing waste). Our study deals with the case in which several stock lengths are available in limited quantities, and we have focused on problems of low demand. Some heuristic methods are proposed for obtaining an integer solution and are compared with others. The heuristic methods are empirically analyzed by solving a set of randomly generated instances and a set of instances from the literature. Concerning the latter, most of the optimal solutions of these instances are known, so it was possible to compare the solutions. The proposed methods yielded very small objective function value gaps. (C) 2008 Elsevier Ltd. All rights reserved.
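The paper's own heuristics are not reproduced here; as a hedged baseline for the same setting (several stock lengths in limited quantities, small integer demands), a simple greedy first-fit heuristic:

```python
# Minimal sketch: a greedy first-fit-decreasing heuristic for the 1D cutting
# stock problem with limited stock quantities. A generic baseline, not the
# heuristics proposed in the paper; all numbers are illustrative.
def greedy_cut(stock, demand):
    """stock: {length: quantity}; demand: {item_length: quantity}."""
    items = sorted((l for l, q in demand.items() for _ in range(q)),
                   reverse=True)                  # largest items first
    pool = sorted(l for l, q in stock.items() for _ in range(q))
    open_bars = []                                # residual lengths in use
    for item in items:
        for i, residual in enumerate(open_bars):  # first open bar that fits
            if residual >= item:
                open_bars[i] -= item
                break
        else:                                     # open the smallest bar that fits
            bar = next((b for b in pool if b >= item), None)
            if bar is None:
                raise ValueError(f"no stock length fits item {item}")
            pool.remove(bar)
            open_bars.append(bar - item)
    return sum(open_bars)                         # total trim loss

print("waste:", greedy_cut(stock={60: 2, 80: 1}, demand={25: 3, 30: 2}))
```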
Abstract:
In this paper we present a novel approach to multispectral image contextual classification by combining iterative combinatorial optimization algorithms. The pixel-wise decision rule is defined using a Bayesian approach that combines two MRF models: a Gaussian Markov Random Field (GMRF) for the observations (likelihood) and a Potts model for the a priori knowledge, which regularizes the solution in the presence of noisy data. Hence, the classification problem is stated within a Maximum a Posteriori (MAP) framework. To approximate the MAP solution, we apply several combinatorial optimization methods using multiple simultaneous initializations, making the solution less sensitive to the initial conditions and reducing both computational cost and time in comparison with Simulated Annealing, which is often unfeasible in real image processing applications. The Markov Random Field model parameters are estimated by a Maximum Pseudo-Likelihood (MPL) approach, avoiding manual adjustment of the regularization parameters. Asymptotic evaluations assess the accuracy of the proposed parameter estimation procedure. To test and evaluate the proposed classification method, we adopt metrics for quantitative performance assessment (Cohen's Kappa coefficient), allowing a robust and accurate statistical analysis. The results clearly show that combining sub-optimal contextual algorithms significantly improves the classification performance, indicating the effectiveness of the proposed methodology. (C) 2010 Elsevier B.V. All rights reserved.
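A hedged sketch of one sub-optimal combinatorial optimizer of the kind the paper combines, ICM (iterated conditional modes), with a single-band Gaussian likelihood and a Potts prior; the parameters are illustrative, and the paper's MPL estimation and multiple-initialization fusion are not shown.

```python
# Minimal sketch of ICM for MAP classification with a Gaussian likelihood
# and a Potts smoothness prior. Class means, noise level, and beta are
# illustrative assumptions on synthetic single-band data.
import numpy as np

rng = np.random.default_rng(4)
means = np.array([0.0, 1.0, 2.0])           # per-class means (assumed)
sigma, beta = 0.4, 1.5                      # noise level, Potts weight
truth = rng.integers(0, 3, (32, 32))
img = means[truth] + rng.normal(0, sigma, truth.shape)

labels = np.abs(img[..., None] - means).argmin(-1)   # ML initialization

for _ in range(5):                          # ICM sweeps
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            # 4-connected neighbour labels inside the image.
            nb = [labels[x, y]
                  for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                  if 0 <= x < img.shape[0] and 0 <= y < img.shape[1]]
            # Posterior energy per class: data term + Potts disagreement.
            energy = ((img[i, j] - means) ** 2 / (2 * sigma ** 2)
                      + beta * np.array([sum(n != k for n in nb)
                                         for k in range(len(means))]))
            labels[i, j] = energy.argmin()

print("agreement with truth:", round(float((labels == truth).mean()), 3))
```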
Abstract:
This work describes a novel methodology for automatic contour extraction from 2D images of 3D neurons (e.g., camera lucida images and other types of 2D microscopy). Most contour-based shape analysis methods cannot be used to characterize such cells because of overlaps between neuronal processes. The proposed framework is specifically aimed at the problem of contour following even in the presence of multiple overlaps. First, the input image is preprocessed in order to obtain an 8-connected skeleton with one-pixel-wide branches, as well as a set of critical regions (i.e., bifurcations and crossings). Next, for each subtree, the tracking stage iteratively labels all valid pixels of branches up to a critical region, where it determines the suitable direction to proceed. Finally, the labeled skeleton segments are followed in order to yield the parametric contour of the neuronal shape under analysis. The reported system was successfully tested on several images; the results from a set of three neuron images are presented here, each pertaining to a different class (i.e., alpha, delta, and epsilon ganglion cells) and containing a total of 34 crossings. The algorithm successfully resolved all of these overlaps and has also proven robust for images with close parallel segments. The proposed method is robust and may be implemented in an efficient manner. The introduction of this approach should pave the way for a more systematic application of contour-based shape analysis methods in neuronal morphology. (C) 2008 Elsevier B.V. All rights reserved.
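A hedged sketch of the preprocessing stage only (skeletonization and detection of critical regions as skeleton pixels with three or more skeleton neighbours), using scikit-image on a synthetic stand-in image; the tracking and contour-following stages of the paper are not reproduced.

```python
# Minimal sketch of the preprocessing stage: reduce a binary image to a
# one-pixel-wide skeleton and flag critical regions (bifurcations or
# crossings) as skeleton pixels with >= 3 skeleton neighbours. The input
# is a synthetic stand-in, not a neuron image.
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

shape = np.zeros((40, 40), dtype=bool)      # synthetic "neuron": a cross
shape[18:22, 5:35] = True
shape[5:35, 18:22] = True

skel = skeletonize(shape)

# Count 8-connected skeleton neighbours of every skeleton pixel.
kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
neighbours = convolve(skel.astype(int), kernel, mode="constant")
critical = skel & (neighbours >= 3)         # bifurcations and crossings

print("skeleton pixels:", int(skel.sum()),
      "critical-region pixels:", int(critical.sum()))
```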
Abstract:
We review several asymmetrical links for binary regression models and present a unified approach for two skew-probit links proposed in the literature. Moreover, under the skew-probit link, conditions for the existence of the ML estimators and of the posterior distribution under improper priors are established. The framework proposed here considers two sets of latent variables which are helpful for implementing the Bayesian MCMC approach. A simulation study of criteria for model comparison is conducted, and two applications are presented. Using different Bayesian criteria, we show that, for these data sets, the skew-probit links are better than alternative links proposed in the literature.
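One common construction of a skew-probit link takes the skew-normal CDF as the inverse link, P(y = 1 | x) = F_SN(x'beta; lambda); this illustrates the general idea only, not the two specific proposals unified in the paper. A minimal sketch with scipy:

```python
# Minimal sketch: a skew-probit inverse link built from the skew-normal CDF.
# This is a generic illustration; the paper's two specific skew-probit links
# and their Bayesian MCMC fitting are not reproduced. Values are illustrative.
import numpy as np
from scipy.stats import norm, skewnorm

def skew_probit_p(eta, lam):
    """Success probability under a skew-normal inverse link."""
    return skewnorm.cdf(eta, a=lam)

eta = np.linspace(-2, 2, 5)                 # linear predictor values
print("probit:      ", norm.cdf(eta).round(3))
print("skew-probit: ", skew_probit_p(eta, lam=3.0).round(3))
# Nonzero lambda makes the link asymmetric: the response probability
# approaches 0 and 1 at different rates, unlike the symmetric probit.
```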
Abstract:
Approximate Lie symmetries of the Navier-Stokes equations are used in applications to scaling phenomena arising in turbulence. In particular, we show that the Lie symmetries of the Euler equations are inherited by the Navier-Stokes equations in the form of approximate symmetries, which allows the Reynolds number dependence to be incorporated into scaling laws. Moreover, the optimal systems of all finite-dimensional Lie subalgebras of the approximate symmetry transformations of the Navier-Stokes equations are constructed. We show how the scaling groups obtained can be used to introduce the Reynolds number dependence into scaling laws explicitly for stationary parallel turbulent shear flows. This is demonstrated in the framework of a new approach to deriving scaling laws based on symmetry analysis [11]-[13].
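For orientation, the classical scaling groups at issue (our summary of textbook facts, not equations taken from the paper):

```latex
% Textbook scaling symmetries (our summary). Euler equations admit a
% two-parameter scaling group with parameters a, b:
x \to e^{a} x, \quad t \to e^{b} t, \quad
u \to e^{a-b} u, \quad p \to e^{2(a-b)} p .
% The viscous term of the Navier-Stokes equations forces
% \nu \to e^{2a-b} \nu, so with \nu fixed only the one-parameter subgroup
% b = 2a survives exactly; treating 1/Re as a small parameter recovers the
% Euler scalings as approximate symmetries.
```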
Abstract:
Let (M, g) be a complete Riemannian manifold and Omega an open subset of M whose closure is homeomorphic to an annulus. We prove that if the boundary of Omega is smooth and satisfies a strong concavity assumption, then there are at least two distinct geodesics in the closure of Omega starting orthogonally to one connected component of the boundary and arriving orthogonally onto the other one. Using the results given in Giambò et al. (Adv Differ Equ 10:931-960, 2005), we then obtain a proof of the existence of two distinct homoclinic orbits for an autonomous Lagrangian system emanating from a nondegenerate maximum point of the potential energy, and a proof of the existence of two distinct brake orbits for a class of Hamiltonian systems. Under a further symmetry assumption, the result is improved by showing the existence of at least dim(M) pairs of geometrically distinct geodesics as above, brake orbits, and homoclinic orbits. In our proof we use recent deformation results proved in Giambò et al. (Nonlinear Anal Ser A: Theory Methods Appl 73:290-337, 2010).