15 results for Numerical integration methods

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 100.00%

Abstract:

In this paper, we propose a random intercept Poisson model in which the random effect is assumed to follow a generalized log-gamma (GLG) distribution. This random effect accommodates the overdispersion in the counts and induces within-cluster correlation. We derive the first two moments of the marginal distribution as well as the intraclass correlation. Even though numerical integration methods are, in general, required for deriving the marginal models, we obtain the multivariate negative binomial model from a particular parameter setting of the hierarchical model. An iterative process is derived for obtaining the maximum likelihood estimates of the parameters in the multivariate negative binomial model. Residual analysis is proposed, and two applications with real data are given for illustration. (C) 2011 Elsevier B.V. All rights reserved.
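
As an illustration of the numerical integration the abstract refers to, the sketch below marginalizes a Poisson count over a random intercept by quadrature. A normal random effect on the log scale is used here as a stand-in for the generalized log-gamma distribution, and all parameter values are hypothetical.

```python
# Minimal sketch: marginal probability of a count under a random-intercept
# Poisson model, with the random effect integrated out numerically.
# A normal random intercept on the log scale is used as a stand-in for the
# generalized log-gamma distribution of the paper; values are illustrative.
import numpy as np
from scipy.integrate import quad
from scipy.stats import poisson, norm

def marginal_pmf(y, x_beta, sigma):
    """P(Y = y) = integral of Poisson(y | exp(x_beta + b)) * N(b | 0, sigma^2) db."""
    integrand = lambda b: poisson.pmf(y, np.exp(x_beta + b)) * norm.pdf(b, scale=sigma)
    value, _ = quad(integrand, -10 * sigma, 10 * sigma)
    return value

# Example: mean count exp(1.0) with overdispersion induced by sigma = 0.5
print(marginal_pmf(y=3, x_beta=1.0, sigma=0.5))
```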

Relevance: 80.00%

Abstract:

Study aim. - We describe a new neuronavigation-guided technique to target the posterior-superior insula (PSI) using a cooled double-cone coil for deep cortical stimulation. Introduction. - Despite the analgesic effects brought about by repetitive transcranial magnetic stimulation (rTMS) of the primary motor and prefrontal cortices, a significant proportion of patients remain symptomatic. This encouraged the search for new targets that may provide stronger pain relief. There is growing evidence that the posterior insula is implicated in the integration of painful stimuli in different pain syndromes and in homeostatic thermal integration. Methods. - The primary motor cortex representation of the lower leg was used to calculate the motor threshold and, thus, estimate the intensity of PSI stimulation. Results. - Seven healthy volunteers were stimulated at 10 Hz over the right PSI and showed subjective changes in cold perception. The technique was safe and well tolerated. Conclusions. - The right posterior-superior insula is worth considering in future studies as a possible rTMS target in chronic pain patients. (c) 2012 Elsevier Masson SAS. All rights reserved.

Relevance: 80.00%

Abstract:

Some atomic multipoles (charges, dipoles, and quadrupoles) from the Quantum Theory of Atoms in Molecules (QTAIM) and CHELPG charges are used to investigate interactions between a proton and a molecule (F2, Cl2, BF, AlF, BeO, MgO, LiH, H2CO, NH3, PH3, BF3, and CO2). Calculations were done at the B3LYP/6-311G(3d,3p) level. The main aspect of this work is the investigation of polarization effects on electrostatic potentials and atomic multipoles over medium to long interaction distances. Large electronic charge fluxes and polarization changes are induced by a proton mainly when this positive particle approaches the least electronegative atom of diatomic heteronuclear molecules. The search for simple equations to describe polarization effects on electrostatic potentials from QTAIM quantities resulted in linear relations with r^-4 (where r is the interaction distance) for many cases. Moreover, the contribution from atomic dipoles to these potentials is usually the contribution most affected by polarization, which reinforces the need to include these dipoles in a minimal description of purely electrostatic interactions. Finally, CHELPG charges provide a description of polarization effects on electrostatic potentials that disagrees with physical arguments for some of these molecules. (c) 2012 Wiley Periodicals, Inc.
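
As a small illustration of the reported r^-4 dependence, the sketch below fits a polarization contribution of the potential to c*r^-4 by least squares; the (r, V_pol) values are hypothetical placeholders, not the paper's data.

```python
# Minimal sketch: least-squares fit of a polarization contribution to an
# electrostatic potential against r^-4, as suggested by the abstract.
# The (r, v_pol) values below are hypothetical placeholders.
import numpy as np

r = np.array([3.0, 4.0, 5.0, 6.0, 8.0, 10.0])                       # interaction distance (a.u.)
v_pol = np.array([0.110, 0.036, 0.0150, 0.0072, 0.0023, 0.00095])   # polarization term (a.u.)

x = r**-4
c, residual, *_ = np.linalg.lstsq(x[:, None], v_pol, rcond=None)
print(f"fitted V_pol ~ {c[0]:.3f} * r^-4")
```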

Relevance: 80.00%

Abstract:

Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user-defined error threshold. Independent of this error, streamlines computed using edge maps are guaranteed to be consistent up to floating-point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors, which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user-defined error bound. Finally, we introduce new visualizations that use the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures.
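
As a sketch of the boundary-to-boundary idea behind edge maps, the snippet below traces a streamline entering a single triangle until it exits, using RK4 on an assumed linear velocity field; it omits the paper's error quantification and consistency machinery, and the triangle, field, and step size are assumptions.

```python
# Minimal sketch: map a point entering a triangle's boundary to the point where
# its streamline exits, by RK4 integration of an assumed linear velocity field.
# This only illustrates the boundary-to-boundary idea behind edge maps.
import numpy as np

TRI = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # unit reference triangle

def velocity(p):
    # Assumed linear field: a gentle rotation plus drift.
    return np.array([0.4 - 0.3 * p[1], 0.1 + 0.3 * p[0]])

def inside(p, eps=1e-12):
    # Barycentric coordinates of the unit reference triangle TRI.
    l1, l2 = p
    l0 = 1.0 - l1 - l2
    return min(l0, l1, l2) >= -eps

def exit_point(p, h=1e-3, max_steps=100000):
    for _ in range(max_steps):
        k1 = velocity(p)
        k2 = velocity(p + 0.5 * h * k1)
        k3 = velocity(p + 0.5 * h * k2)
        k4 = velocity(p + h * k3)
        q = p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        if not inside(q):
            return q        # first point outside: the (approximate) exit location
        p = q
    return p

entry = np.array([0.25, 0.0])   # a point on the bottom edge
print("exit near", exit_point(entry))
```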

Relevance: 40.00%

Abstract:

In this work, different methods to estimate the value of thin-film residual stresses using instrumented indentation data were analyzed. This study considered procedures proposed in the literature, as well as a modification of one of these methods and a new approach based on the effect of residual stress on the value of hardness calculated via the Oliver and Pharr method. The analysis of these methods was centered on an axisymmetric two-dimensional finite element model, which was developed to simulate instrumented indentation testing of thin ceramic films deposited onto hard steel substrates. Simulations were conducted varying the level of film residual stress, film strain-hardening exponent, film yield strength, and film Poisson's ratio. Different ratios of maximum penetration depth h_max over film thickness t were also considered, including h_max/t = 0.04, for which the contribution of the substrate to the mechanical response of the system is not significant. Residual stresses were then calculated following the procedures mentioned above and compared with the values used as input in the numerical simulations. In general, results indicate that the difference that each method provides with respect to the input values depends on the conditions studied. The method by Suresh and Giannakopoulos consistently overestimated the values when stresses were compressive. The method provided by Wang et al. has shown less dependence on h_max/t than the others.
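
As a reminder of the Oliver and Pharr quantities involved, the sketch below computes hardness from illustrative load-depth values, assuming an ideal Berkovich area function; these numbers are not the paper's data.

```python
# Minimal sketch: hardness from instrumented-indentation data via the
# Oliver-Pharr method. Input values are illustrative; an ideal Berkovich
# area function A = 24.5 * hc^2 is assumed.
P_max = 10.0e-3        # maximum load, N
h_max = 0.30e-6        # maximum penetration depth, m
S     = 1.2e5          # contact stiffness dP/dh at the onset of unloading, N/m
eps   = 0.75           # geometry factor for a Berkovich indenter

h_c = h_max - eps * P_max / S          # contact depth, m
A_c = 24.5 * h_c**2                    # projected contact area, m^2
H   = P_max / A_c                      # hardness, Pa

print(f"h_c = {h_c*1e9:.1f} nm, H = {H/1e9:.2f} GPa")
```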

Relevance: 30.00%

Abstract:

Background: A current challenge in gene annotation is to define gene function in the context of a network of relationships instead of using single genes. The inference of gene networks (GNs) has emerged as an approach to better understand the biology of the system and to study how its components interact with each other while keeping their functions stable. However, in general there are not enough data to accurately recover the GNs from their expression levels, leading to the curse of dimensionality, in which the number of variables is higher than the number of samples. One way to mitigate this problem is to integrate biological data instead of using only the expression profiles in the inference process. The use of several types of biological information in inference methods has increased significantly in recent years, in order to better recover the connections between genes and reduce false positives. What makes this strategy so interesting is the possibility of confirming the known connections through the included biological data, and the possibility of discovering new relationships between genes when the expression data are observed. Although several works on data integration have increased the performance of network inference methods, the real contribution of each type of biological information to the obtained improvement is not clear. Methods: We propose a methodology to include biological information in an inference algorithm in order to assess the prediction gain obtained by using biological information and expression profiles together. We also evaluated and compared the gain of adding four types of biological information: (a) protein-protein interaction, (b) Rosetta stone fusion proteins, (c) KEGG and (d) KEGG+GO. Results and conclusions: This work presents a first comparison of the gain obtained from the use of prior biological information in the inference of GNs for a eukaryotic organism (P. falciparum). Our results indicate that information based on direct interaction can produce a greater improvement in the gain than data about less specific relationships such as GO or KEGG. Also, as expected, the results show that the use of biological information is a very important approach for improving the inference. We also compared the gain in the inference of the global network and of only the hubs. The results indicate that the use of biological information can improve the identification of the most connected proteins.
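
As a loose illustration of the data-integration idea (not the inference algorithm evaluated in the paper), the sketch below biases an expression-based correlation score with prior interaction evidence such as PPI; all data and the weighting scheme are hypothetical.

```python
# Minimal sketch: bias an expression-based edge score with prior interaction
# evidence (e.g. protein-protein interactions). This is one illustrative
# strategy, not the paper's inference algorithm; all data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
expr = rng.normal(size=(20, 5))                 # 20 samples x 5 genes (toy data)
corr = np.abs(np.corrcoef(expr, rowvar=False))  # expression-based edge scores

prior = np.zeros((5, 5))                        # 1 where a prior source reports an interaction
prior[0, 1] = prior[1, 0] = 1.0
prior[2, 3] = prior[3, 2] = 1.0

alpha = 0.3                                     # weight given to the prior evidence
score = (1 - alpha) * corr + alpha * prior
np.fill_diagonal(score, 0.0)

edges = score > 0.5                             # thresholded network
print(edges.astype(int))
```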

Relevance: 30.00%

Abstract:

In the past few decades, detailed observations of radio and X-ray emission from massive binary systems have revealed a whole new physics present in such systems. Both thermal and non-thermal components of this emission indicate that most of the radiation at these bands originates in shocks. O- and B-type stars and Wolf-Rayet (WR) stars present supersonic and massive winds that, when colliding, emit largely due to free-free radiation. The non-thermal radio and X-ray emissions are due to synchrotron and inverse Compton processes, respectively. In this case, magnetic fields are expected to play an important role in the emission distribution. In the past few years, the modelling of the free-free and synchrotron emissions from massive binary systems has been based on purely hydrodynamical simulations and ad hoc assumptions regarding the distribution of magnetic energy and the field geometry. In this work we provide the first full magnetohydrodynamic numerical simulations of wind-wind collision in massive binary systems. We study the free-free emission, characterizing its dependence on the stellar and orbital parameters. We also study self-consistently the evolution of the magnetic field at the shock region, obtaining the synchrotron energy distribution integrated along different lines of sight. We show that the magnetic field in the shocks is larger than that obtained when proportionality between B and the plasma density is assumed. Also, we show that the role of the synchrotron emission relative to the total radio emission has been underestimated.
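
As an illustration of the kind of line-of-sight integration involved (not the MHD simulation itself), the sketch below integrates an optically thin, radio-regime free-free emissivity, which scales as density squared over the square root of temperature, along a single ray through hypothetical density and temperature profiles; constants and the Gaunt factor are dropped.

```python
# Minimal sketch: free-free intensity along one line of sight, using the
# optically thin radio-regime scaling emissivity ~ n_e^2 * T^(-1/2)
# (Gaunt factor and physical constants dropped; arbitrary units).
# The density/temperature profiles below are hypothetical, not simulation output.
import numpy as np

z = np.linspace(0.0, 1.0, 1000)                       # path-length coordinate
n_e = 1.0 + 4.0 * np.exp(-((z - 0.5) / 0.05) ** 2)    # density bump at the shock
T   = 1.0 + 9.0 * np.exp(-((z - 0.5) / 0.05) ** 2)    # heated shock region

emissivity = n_e**2 / np.sqrt(T)
# Trapezoidal integration along the line of sight
intensity = np.sum(0.5 * (emissivity[1:] + emissivity[:-1]) * np.diff(z))
print(f"free-free intensity (arbitrary units): {intensity:.3f}")
```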

Relevance: 30.00%

Abstract:

The objective of this work was to evaluate extreme water table depths in a watershed, using methods of geographical spatial data analysis. The groundwater spatio-temporal dynamics was evaluated in an outcrop of the Guarani Aquifer System. Water table depths were estimated from the monitoring of water levels in 23 piezometers and from time series modeling available from April 2004 to April 2011. For the generation of spatial scenarios, geostatistical techniques were used that incorporate into the prediction ancillary information related to the geomorphological patterns of the watershed, using a digital elevation model. This procedure improved the estimates, owing to the high correlation between water levels and elevation, and added physical meaning to the predictions. The scenarios showed differences regarding the extreme levels - too deep or too shallow - and can support water planning, efficient water use, and sustainable water management in the watershed.
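
As a loose illustration of how ancillary elevation information can enter the prediction (not the authors' geostatistical workflow), the sketch below combines a linear trend on DEM elevation with simple kriging of the residuals under an assumed exponential covariance; all coordinates, depths, elevations, and covariance parameters are hypothetical.

```python
# Minimal sketch: predict water-table depth at a new location using a linear
# trend on elevation plus simple kriging of the residuals (exponential
# covariance). All values below are hypothetical toy data.
import numpy as np

xy    = np.array([[0.0, 0.0], [1.0, 0.2], [0.4, 0.9], [0.8, 0.7], [0.2, 0.5]])  # piezometers
depth = np.array([12.0, 8.5, 10.2, 9.0, 11.1])                                   # water-table depth, m
elev  = np.array([650.0, 630.0, 641.0, 634.0, 646.0])                            # DEM elevation, m

# 1) Linear trend of depth on elevation
A = np.column_stack([np.ones(len(elev)), elev])
beta, *_ = np.linalg.lstsq(A, depth, rcond=None)
resid = depth - A @ beta

# 2) Simple kriging of residuals with an assumed exponential covariance
sill, corr_range = 1.0, 0.5
def cov(h):
    return sill * np.exp(-h / corr_range)

d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
C = cov(d) + 1e-9 * np.eye(len(xy))          # small nugget for conditioning

x0, elev0 = np.array([0.5, 0.5]), 642.0      # prediction location and its DEM elevation
c0 = cov(np.linalg.norm(xy - x0, axis=1))
w = np.linalg.solve(C, c0)

prediction = (beta[0] + beta[1] * elev0) + w @ resid
print(f"predicted water-table depth: {prediction:.2f} m")
```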

Relevance: 30.00%

Abstract:

This work presents numerical simulations of two fluid flow problems involving moving free surfaces: the impacting drop and fluid jet buckling. The viscoelastic model used in these simulations is the eXtended Pom-Pom (XPP) model. To validate the code, numerical predictions of the drop impact problem for Newtonian and Oldroyd-B fluids are presented and compared with other methods. In particular, a benchmark of numerical simulations for an XPP drop impacting on a rigid plate is performed for a wide range of the relevant parameters. Finally, to provide an additional application of free surface flows of XPP fluids, the viscous jet buckling problem is simulated and discussed. (C) 2011 Elsevier B.V. All rights reserved.
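
The free-surface XPP solver cannot be condensed into a snippet, but as a hedged stand-in the sketch below integrates in time a zero-dimensional viscoelastic problem, start-up of simple shear for an Oldroyd-B fluid (the model used in the validation step); parameters are illustrative and this is not the paper's code.

```python
# Minimal sketch: start-up of simple shear for an Oldroyd-B fluid, integrated
# in time with solve_ivp. A 0-D stand-in that only illustrates time integration
# of a viscoelastic constitutive model; it is not the XPP free-surface solver.
import numpy as np
from scipy.integrate import solve_ivp

lam, eta_p, eta_s, gdot = 1.0, 1.0, 0.1, 2.0   # relaxation time, viscosities, shear rate

def rhs(t, tau):
    # Polymer extra-stress components (txx, txy, tyy) in simple shear.
    txx, txy, tyy = tau
    dtxx = 2.0 * gdot * txy - txx / lam
    dtxy = gdot * tyy - txy / lam + eta_p * gdot / lam
    dtyy = -tyy / lam
    return [dtxx, dtxy, dtyy]

sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0, 0.0], t_eval=[10.0])
txx, txy, tyy = sol.y[:, -1]
total_shear_stress = eta_s * gdot + txy
print(f"steady shear stress ~ {total_shear_stress:.3f}, N1 ~ {txx - tyy:.3f}")
```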

Relevance: 30.00%

Abstract:

The stability of two recently developed pressure spaces has been assessed numerically: the space proposed by Ausas et al. [R.F. Ausas, F.S. Sousa, G.C. Buscaglia, An improved finite element space for discontinuous pressures, Comput. Methods Appl. Mech. Engrg. 199 (2010) 1019-1031], which is capable of representing discontinuous pressures, and the space proposed by Coppola-Owen and Codina [A.H. Coppola-Owen, R. Codina, Improving Eulerian two-phase flow finite element approximation with discontinuous gradient pressure shape functions, Int. J. Numer. Methods Fluids 49 (2005) 1287-1304], which can represent discontinuities in pressure gradients. We assess the stability of these spaces by numerically computing the inf-sup constants of several meshes. The inf-sup constant is obtained as the solution of a generalized eigenvalue problem. Both spaces are in this way confirmed to be stable in their original form. An application of the same numerical assessment tool to the stabilized equal-order P1/P1 formulation is then reported. An interesting finding is that the stabilization coefficient can be safely set to zero in an arbitrary band of elements without compromising the formulation's stability. An analogous result is also reported for the mini-element P1+/P1 when the velocity bubbles are removed in an arbitrary band of elements. (C) 2012 Elsevier B.V. All rights reserved.
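
The numerical assessment described here is commonly posed as follows: with a velocity matrix A, the discrete divergence B, and the pressure mass matrix M_p assembled, the discrete inf-sup constant is the square root of the smallest eigenvalue of the generalized eigenproblem (B A^-1 B^T) q = lambda M_p q. The sketch below shows the mechanics on small placeholder matrices, not on matrices assembled from the spaces studied in the paper.

```python
# Minimal sketch: discrete inf-sup constant from assembled matrices, as the
# square root of the smallest eigenvalue of the generalized eigenproblem
#   (B A^{-1} B^T) q = lambda M_p q.
# A, B and M_p below are random placeholders, only to show the mechanics;
# in practice they come from the finite element assembly.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n_u, n_p = 12, 4                          # velocity and pressure dofs (toy sizes)

A  = rng.normal(size=(n_u, n_u)); A  = A  @ A.T  + n_u * np.eye(n_u)   # SPD velocity matrix
Mp = rng.normal(size=(n_p, n_p)); Mp = Mp @ Mp.T + n_p * np.eye(n_p)   # SPD pressure mass matrix
B  = rng.normal(size=(n_p, n_u))                                       # discrete divergence

S = B @ np.linalg.solve(A, B.T)           # pressure Schur complement B A^{-1} B^T
eigvals = eigh(S, Mp, eigvals_only=True)  # generalized eigenvalues, ascending
beta = np.sqrt(max(eigvals[0], 0.0))      # discrete inf-sup constant
print(f"discrete inf-sup constant ~ {beta:.4f}")
```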

Relevance: 30.00%

Abstract:

Introduction: The aim of this study was to assess the epidemiological and operational characteristics of the Leprosy Program before and after its integration into the Primary Healthcare Services of the municipality of Aracaju-Sergipe, Brazil. Methods: Data were drawn from the national database. The study periods were divided into preintegration (1996-2000) and postintegration (2001-2007). Annual detection rates were calculated. Frequency data on the clinico-epidemiological variables of cases detected and treated in the two periods were compared using the Chi-squared (χ²) test, adopting a 5% level of significance. Results: Detection rates overall, and in subjects younger than 15 years, were greater for the postintegration period and were higher than the rates recorded for Brazil as a whole during the same periods. A total of 780 and 1,469 cases were registered during the preintegration and postintegration periods, respectively. Observations for the postintegration period were as follows: I) a higher proportion of cases with disability grade assessed at diagnosis, with an increase from 60.9% to 78.8% (p < 0.001), and at the end of treatment, from 41.4% to 44.4% (p < 0.023); II) an increase in the proportion of cases detected by contact examination, from 2.1% to 4.1% (p < 0.001); and III) a lower level of treatment default, with a decrease from 5.64 to 3.35 (p < 0.008). Only 34% of cases registered from 2001 to 2007 were examined. Conclusions: The shift observed in detection rates overall, and in subjects younger than 15 years, during the postintegration period indicates an increased level of health care access. The fall in the number of patients abandoning treatment indicates greater adherence to treatment. However, previous shortcomings in key actions, pivotal to attaining the outcomes and impact envisaged for the program, persisted in the postintegration period.
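
As an illustration of the chi-squared comparison reported, the sketch below tests the pre- versus post-integration proportions of cases with disability grade assessed at diagnosis; the counts are reconstructed approximately from the reported totals and percentages, so the table is illustrative rather than the study's exact data.

```python
# Minimal sketch: chi-squared comparison of the proportion of cases with
# disability grade assessed at diagnosis, pre- vs post-integration. Counts are
# approximations derived from the reported totals (780 and 1,469 cases) and
# percentages (60.9% and 78.8%), not the study's exact contingency table.
from scipy.stats import chi2_contingency

assessed_pre,  total_pre  = round(0.609 * 780),  780
assessed_post, total_post = round(0.788 * 1469), 1469

table = [[assessed_pre,  total_pre  - assessed_pre],
         [assessed_post, total_post - assessed_post]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")
```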

Relevance: 30.00%

Abstract:

Background: The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible, we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for representing clinical and socio-demographic information. Results: We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: the data level, to store the data; the semantic level, to integrate and standardize the data through the use of ontologies; the application level, to manage clinical databases, ontologies and the data integration process; and the web interface level, to allow interaction between the user and the system. The Clinical Module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a real clinical research database. Clinical and demographic data, as well as biomaterial data, were obtained from patients with head and neck tumors. We implemented the IPTrans tool, a complete environment for data migration, which comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as to other applications. Conclusions: Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different “omics” technologies with patients' clinical and socio-demographic data. This framework should present some features: flexibility, compression and robustness. The experiments performed on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
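
As a sketch of the Entity-Attribute-Value idea on which the Clinical Module is based (the table and column names below are hypothetical illustrations, not Chado's actual schema), the snippet stores clinical facts as attribute-value rows, so new attributes require no schema change.

```python
# Minimal sketch of Entity-Attribute-Value (EAV) storage, the model on which
# the Clinical Module is based. Table and column names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE clinical_eav (
        patient_id   INTEGER,     -- entity
        attribute    TEXT,        -- attribute (ideally an ontology term)
        value        TEXT         -- value
    )
""")
rows = [
    (1, "tumor_site", "larynx"),
    (1, "smoking_status", "former"),
    (2, "tumor_site", "oral cavity"),
]
con.executemany("INSERT INTO clinical_eav VALUES (?, ?, ?)", rows)

# New attributes are simply new rows; the schema stays unchanged.
for row in con.execute("SELECT * FROM clinical_eav WHERE patient_id = 1"):
    print(row)
```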

Relevance: 30.00%

Abstract:

The boundary layer over concave surfaces can be unstable due to centrifugal forces, giving rise to Goertler vortices. These vortices create two regions in the spanwise direction - the upwash and downwash regions. The downwash region is responsible for compressing the boundary layer toward the wall, increasing the heat transfer rate; the upwash region does the opposite. In the nonlinear development of the Goertler vortices, it can be observed that the upwash region becomes narrow and the spanwise-averaged heat transfer rate is higher than that for a Blasius boundary layer. This paper analyzes the influence of the spanwise wavelength of the Goertler vortices on the heat transfer. The governing equations are written in vorticity-velocity formulation. The time integration is done via a classical fourth-order Runge-Kutta method. The spatial derivatives are calculated using high-order compact finite difference and spectral methods. Three different wavelengths are analyzed. The results show that steady Goertler flow can increase the heat transfer rates to values close to those of turbulent flow, without the existence of a secondary instability. The geometry and the computational domain are also presented.
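
As a sketch of the time-integration step mentioned, the snippet below applies the classical fourth-order Runge-Kutta method to a spatially discretized system du/dt = f(u); the right-hand side is a placeholder, not the paper's vorticity-velocity operator with compact finite differences.

```python
# Minimal sketch: classical fourth-order Runge-Kutta time step for a spatially
# discretized system du/dt = f(u). The right-hand side below is a linear decay
# placeholder, not the vorticity-velocity operator of the paper.
import numpy as np

def rk4_step(f, u, dt):
    k1 = f(u)
    k2 = f(u + 0.5 * dt * k1)
    k3 = f(u + 0.5 * dt * k2)
    k4 = f(u + dt * k3)
    return u + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

f = lambda u: -u                 # placeholder right-hand side
u = np.ones(8)
for _ in range(100):             # advance to t = 1 with dt = 0.01
    u = rk4_step(f, u, dt=0.01)
print(u[0], "vs exact", np.exp(-1.0))
```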

Relevance: 30.00%

Abstract:

Hermite interpolation is increasingly proving to be a powerful numerical solution tool, as applied to different kinds of second-order boundary value problems. In this work we present two Hermite finite element methods to solve viscous incompressible flow problems, in both two- and three-dimensional space. In the two-dimensional case we use the Zienkiewicz triangle to represent the velocity field, and in the three-dimensional case an extension of this element to tetrahedra, still called a Zienkiewicz element. Taking the Stokes system as a model, the pressure is approximated with continuous functions, either piecewise linear or piecewise quadratic, according to the version of the Zienkiewicz element in use, that is, with either incomplete or complete cubics. The methods employ either the standard Galerkin or the Petrov-Galerkin (PG) formulation first proposed in Hughes et al. (1986) [18], based on the addition of a balance-of-force term. A priori error analyses point to optimal convergence rates for the PG approach, and for the Galerkin formulation too, at least in some particular cases. From the point of view of both accuracy and the global number of degrees of freedom, the new methods are shown to have a favorable cost-benefit ratio compared with velocity Lagrange finite elements of the same order, especially if the Galerkin approach is employed.
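
As a one-dimensional sketch of the Hermite interpolation idea underlying these elements (nodal values and derivatives both serve as degrees of freedom), the snippet below evaluates the classical cubic Hermite interpolant on a single interval; it is a 1-D analogue, not the Zienkiewicz triangle or tetrahedron themselves.

```python
# Minimal sketch: cubic Hermite interpolation on one interval, using function
# values and derivatives at the two end nodes as degrees of freedom; the
# one-dimensional analogue of the Hermite elements discussed.
import numpy as np

def hermite_cubic(x0, x1, f0, f1, d0, d1, x):
    """Interpolate on [x0, x1] from nodal values (f0, f1) and slopes (d0, d1)."""
    h = x1 - x0
    t = (x - x0) / h
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * f0 + h * h10 * d0 + h01 * f1 + h * h11 * d1

# Reproduces sin(x) closely on [0, 1] from end values and end derivatives.
x = np.linspace(0.0, 1.0, 5)
print(hermite_cubic(0.0, 1.0, np.sin(0.0), np.sin(1.0), np.cos(0.0), np.cos(1.0), x))
print(np.sin(x))
```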