12 results for Numerical approximation and analysis
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
This work applies colloidal theory to describe the interaction forces and energies of colloidal complexes present in water and of those formed during the filtration run in direct filtration. Interaction energy profiles between colloidal surfaces are presented for three geometries (sphere, plate, and cylinder) and four surface-interaction arrangements: two cylinders, two spheres, two plates, and a sphere and a plate. Two situations were analyzed, before and after electrostatic destabilization by aluminum sulfate used as coagulant, in water samples prepared with kaolin. The mathematical modeling uses the extended DLVO theory (after Derjaguin, Landau, Verwey, and Overbeek), or XDLVO, which includes the traditional electric double layer (EDL) and London-van der Waals (LvdW) attraction terms together with steric and hydrophobic forces, and additionally considers other forces acting in the colloidal system, such as molecular (Born) repulsion and Lewis acid-base (AB) forces.
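For readers unfamiliar with the DLVO terms mentioned above, the sketch below evaluates the two classical contributions (EDL repulsion and LvdW attraction) for the equal-sphere geometry under the Derjaguin approximation. The parameter values (Hamaker constant, surface potential, ionic strength, particle radius) are illustrative assumptions, not values from the study, and the extended (XDLVO) terms are omitted.

```python
import numpy as np

# Physical constants
KB = 1.380649e-23      # Boltzmann constant, J/K
E0 = 8.8541878128e-12  # vacuum permittivity, F/m
EC = 1.602176634e-19   # elementary charge, C
NA = 6.02214076e23     # Avogadro number, 1/mol

def debye_kappa(ionic_strength_M, eps_r=78.5, T=298.15):
    """Inverse Debye length (1/m) for a 1:1 electrolyte."""
    n0 = 1000.0 * NA * ionic_strength_M          # number density, 1/m^3
    return np.sqrt(2.0 * n0 * EC**2 / (eps_r * E0 * KB * T))

def v_edl_spheres(h, a, psi0, kappa, eps_r=78.5):
    """EDL interaction energy (J) between two equal spheres of radius a
    at surface separation h (Derjaguin approx., low constant potential)."""
    return 2.0 * np.pi * eps_r * E0 * a * psi0**2 * np.log1p(np.exp(-kappa * h))

def v_vdw_spheres(h, a, hamaker):
    """Non-retarded London-van der Waals energy (J) between two equal
    spheres of radius a at surface separation h (Derjaguin approx.)."""
    return -hamaker * a / (12.0 * h)

# Illustrative parameters (assumed, not taken from the study)
a = 0.5e-6                 # particle radius, m (kaolin-sized)
psi0 = -30e-3              # surface potential, V
A = 1.0e-20                # Hamaker constant, J
kappa = debye_kappa(1e-3)  # 1 mM 1:1 electrolyte

h = np.linspace(0.5e-9, 50e-9, 200)
v_total_kT = (v_edl_spheres(h, a, psi0, kappa) + v_vdw_spheres(h, a, A)) / (KB * 298.15)
print("Maximum of the interaction barrier: %.1f kT" % v_total_kT.max())
```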
Abstract:
Different representations for a control surface freeplay nonlinearity in a three-degree-of-freedom aeroelastic system are assessed. These are the discontinuous, polynomial, and hyperbolic tangent representations. The Duhamel formulation is used to model the aerodynamic loads. Assessment of the validity of these representations is performed through comparison with previous experimental observations. The results show that the instability and nonlinear response characteristics are accurately predicted when using the discontinuous and hyperbolic tangent representations. On the other hand, the polynomial representation fails to predict chaotic motions observed in the experiments. (c) 2012 Elsevier Ltd. All rights reserved.
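As an illustration of the three representations compared above, the sketch below defines a generic freeplay restoring moment in its discontinuous form and two smooth approximations of it: a least-squares polynomial fit and a hyperbolic tangent blending of the two switching points. The gap size, stiffness, fitting range, and sharpness parameter are assumed for illustration and are not taken from the paper.

```python
import numpy as np

k = 1.0        # spring stiffness outside the freeplay region (assumed)
delta = 0.01   # half-width of the freeplay gap, rad (assumed)

def freeplay_discontinuous(a):
    """Restoring moment: zero inside the gap, linear outside."""
    return np.where(a > delta, k * (a - delta),
                    np.where(a < -delta, k * (a + delta), 0.0))

def freeplay_tanh(a, eps=500.0):
    """Hyperbolic tangent smoothing of the two switching points."""
    step_up = 0.5 * (1.0 + np.tanh(eps * (a - delta)))   # ~Heaviside(a - delta)
    step_dn = 0.5 * (1.0 - np.tanh(eps * (a + delta)))   # ~Heaviside(-(a + delta))
    return k * (a - delta) * step_up + k * (a + delta) * step_dn

def freeplay_polynomial(a, order=5, fit_range=0.05, npts=401):
    """Least-squares polynomial fit to the discontinuous curve."""
    x = np.linspace(-fit_range, fit_range, npts)
    coeffs = np.polyfit(x, freeplay_discontinuous(x), order)
    return np.polyval(coeffs, a)

alpha = np.linspace(-0.05, 0.05, 11)
for f in (freeplay_discontinuous, freeplay_tanh, freeplay_polynomial):
    print(f.__name__, np.round(f(alpha), 4))
```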
Abstract:
Integrative review of Brazilian studies on evidence-based practices (EBP) in prevention in human health, published in Web of Science/JCR journals between October 2010 and April 2011. The aim was to identify the specialties that most often carried out these studies, their foci, and their methodological approaches. Based on the inclusion criteria, 84 studies were selected, mainly published in public health journals, focusing on primary care and also addressing clinical issues and different specialties. Prevention foci and methodological approaches also varied, with a predominance of systematic reviews without meta-analysis. The results indicate that there is no single way to conceptualize and practice EBP in the field of prevention, and that its application may serve not only to obtain indisputable evidence to support intervention actions. This knowledge area is still under construction, with a view to the analysis and further understanding of health phenomena.
Abstract:
This paper presents a technique for performing analog design synthesis at the circuit level, providing feedback to the designer through exploration of the Pareto frontier. A modified simulated annealing algorithm, able to perform crossover with past anchor points when a local minimum is found, is used as the optimization algorithm in the initial synthesis procedure. After all specifications are met, the algorithm searches for the extreme points of the Pareto frontier in order to obtain a non-exhaustive exploration of the Pareto front. Finally, multi-objective particle swarm optimization is used to spread the results and to find a more accurate frontier. Piecewise-linear functions are used as single-objective cost functions to produce a smooth and equal convergence of all measurements to the desired specifications during the composition of the aggregate objective function. To verify the presented technique, two circuits were designed: a Miller amplifier with 96 dB voltage gain, 15.48 MHz unity-gain frequency, and a slew rate of 19.2 V/µs with a current supply of 385.15 µA; and a complementary folded cascode with 104.25 dB voltage gain, 18.15 MHz unity-gain frequency, and a slew rate of 13.370 MV/µs. These circuits were synthesized using a 0.35 µm technology. The results show that the method provides a fast approach to good solutions using the modified SA, and further good Pareto front exploration through its connection to the particle swarm optimization algorithm.
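The aggregate objective described above can be illustrated with the small sketch below, where each specification contributes a piecewise-linear, single-objective cost that is zero once the specification is met and grows linearly with the normalized violation. The particular specifications, targets, and weights are assumptions for illustration only, not the cost functions used in the paper.

```python
def pw_linear_cost(value, target, maximize=True, slope=1.0):
    """Piecewise-linear cost: zero when the spec is met, linear in the
    normalized violation otherwise."""
    if maximize:
        violation = max(0.0, (target - value) / abs(target))
    else:
        violation = max(0.0, (value - target) / abs(target))
    return slope * violation

def aggregate_cost(measurements, specs):
    """Weighted sum of the individual piecewise-linear costs."""
    return sum(w * pw_linear_cost(measurements[name], target, maximize)
               for name, (target, maximize, w) in specs.items())

# Illustrative specs: (target, maximize?, weight) -- assumed values
specs = {
    "gain_db":       (90.0,  True,  1.0),
    "ugf_mhz":       (10.0,  True,  1.0),
    "slew_v_per_us": (15.0,  True,  1.0),
    "supply_ua":     (400.0, False, 0.5),   # current consumption is minimized
}
candidate = {"gain_db": 96.0, "ugf_mhz": 8.2, "slew_v_per_us": 19.2, "supply_ua": 385.0}
print("aggregate cost:", aggregate_cost(candidate, specs))
```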
Abstract:
Background: Acute respiratory distress syndrome (ARDS) is associated with high in-hospital mortality. Alveolar recruitment followed by ventilation at optimal titrated PEEP may reduce ventilator-induced lung injury and improve oxygenation in patients with ARDS, but the effects on mortality and other clinical outcomes remain unknown. This article reports the rationale, study design, and analysis plan of the Alveolar Recruitment for ARDS Trial (ART). Methods/Design: ART is a pragmatic, multicenter, randomized (concealed), controlled trial, which aims to determine if maximum stepwise alveolar recruitment associated with PEEP titration is able to increase 28-day survival in patients with ARDS compared to conventional treatment (ARDSNet strategy). We will enroll adult patients with ARDS of less than 72 h duration. The intervention group will receive an alveolar recruitment maneuver, with stepwise increases of PEEP reaching 45 cmH2O and a peak pressure of 60 cmH2O, followed by ventilation with optimal PEEP titrated according to the static compliance of the respiratory system. In the control group, mechanical ventilation will follow a conventional protocol (ARDSNet). In both groups, we will use volume-controlled mode with low tidal volumes (4 to 6 mL/kg of predicted body weight) and a target plateau pressure <= 30 cmH2O. The primary outcome is 28-day survival, and the secondary outcomes are: length of ICU stay; length of hospital stay; pneumothorax requiring chest tube during the first 7 days; barotrauma during the first 7 days; mechanical ventilation-free days from days 1 to 28; and ICU, in-hospital, and 6-month survival. ART is an event-guided trial planned to last until 520 events (deaths within 28 days) are observed. These events allow detection of a hazard ratio of 0.75, with 90% power and a two-tailed type I error of 5%. All analyses will follow the intention-to-treat principle. Discussion: If the ART strategy with maximum recruitment and PEEP titration improves 28-day survival, this will represent a notable advance in the care of ARDS patients. Conversely, if the ART strategy is similar or inferior to the current evidence-based strategy (ARDSNet), this should also change current practice, as many institutions routinely employ recruitment maneuvers and set PEEP levels according to some titration method.
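A quick way to see where an event target of this magnitude comes from is Schoenfeld's approximation for the number of events needed to detect a given hazard ratio in a 1:1 randomized survival comparison. The sketch below gives roughly 508 events for HR = 0.75, 90% power, and a two-sided 5% alpha; the trial's planned 520 presumably reflects additional design adjustments. This calculation is an illustration, not the trial's actual sample-size derivation.

```python
import math
from statistics import NormalDist

def schoenfeld_events(hr, power=0.90, alpha=0.05):
    """Approximate number of events for a 1:1 randomized survival comparison."""
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)   # two-sided alpha
    z_b = NormalDist().inv_cdf(power)
    return 4.0 * (z_a + z_b) ** 2 / math.log(hr) ** 2

print(round(schoenfeld_events(0.75)))   # ~508 events
```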
Abstract:
In this work, an experimental and numerical analysis and characterization of functionally graded structures (FGSs) is developed. Nickel (Ni) and copper (Cu) materials are used as basic materials in the numerical modeling and experimental characterization. For modeling, a MATLAB finite element code is developed, which allows simulation of harmonic and modal analysis considering the graded finite element formulation. For experimental characterization, Ni-Cu FGSs are manufactured by using spark plasma sintering technique. Hardness and Young's modulus are found by using microindentation and ultrasonic measurements, respectively. The effective gradation of Ni/Cu FGS is addressed by means of optical microscopy, energy dispersive spectrometry, scanning electron microscopy and hardness testing. For the purpose of comparing modeling and experimental results, the hardness curve, along the gradation direction, is used for identifying the gradation profile; accordingly, the experimental hardness curve is used for approximating the Young's modulus variation and the graded finite element modeling is used for verification. For the first two resonance frequency values, a difference smaller than 1% between simulated and experimental results is obtained. (C) 2012 Elsevier Ltd. All rights reserved.
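To make the graded finite element idea concrete, the sketch below interpolates Young's modulus continuously along the gradation direction (here with a simple power-law volume-fraction profile between Cu and Ni) and evaluates it at the Gauss points of an element, which is the essence of graded-element formulations. The profile exponent and nominal property values are assumptions for illustration, not the gradation identified in the paper.

```python
import numpy as np

E_NI, E_CU = 200e9, 117e9      # Young's moduli, Pa (nominal handbook values)
L = 10e-3                      # gradation length, m (assumed)

def ni_volume_fraction(x, p=2.0):
    """Power-law gradation profile from pure Cu (x = 0) to pure Ni (x = L)."""
    return (x / L) ** p

def young_modulus(x):
    """Rule-of-mixtures estimate of E(x) along the gradation direction."""
    v = ni_volume_fraction(x)
    return v * E_NI + (1.0 - v) * E_CU

def graded_element_modulus(x1, x2, gauss_pts=2):
    """E evaluated at the Gauss points of a 1-D element [x1, x2], as needed when
    integrating a graded element stiffness matrix."""
    xi, w = np.polynomial.legendre.leggauss(gauss_pts)
    x = 0.5 * (x2 - x1) * xi + 0.5 * (x1 + x2)
    return young_modulus(x), w

E_gp, w = graded_element_modulus(0.0, L / 4)
print("E at the Gauss points of the first element (GPa):", np.round(E_gp / 1e9, 1))
```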
Abstract:
Analytical and numerical analyses of the nonlinear response of a three-degree-of-freedom nonlinear aeroelastic system are performed. Particularly, the effects of concentrated structural nonlinearities on the different motions are determined. The concentrated nonlinearities are introduced in the pitch, plunge, and flap springs by adding cubic stiffness in each of them. Quasi-steady approximation and the Duhamel formulation are used to model the aerodynamic loads. Using the quasi-steady approach, we derive the normal form of the Hopf bifurcation associated with the system's instability. Using the normal form, three configurations including supercritical and subcritical aeroelastic systems are defined and analyzed numerically. The characteristics of these different configurations in terms of stability and motions are evaluated. The usefulness of the two aerodynamic formulations in the prediction of the different motions beyond the bifurcation is discussed.
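The concentrated nonlinearities described above amount to adding a cubic term to each structural spring. A minimal sketch of the resulting restoring-force vector for the plunge, pitch, and flap degrees of freedom is given below; all stiffness values are assumed for illustration and are not the configurations analyzed in the paper.

```python
import numpy as np

# Linear and cubic stiffness coefficients for (plunge h, pitch alpha, flap beta)
K_LIN = np.array([2000.0, 10.0, 1.0])   # assumed linear stiffnesses
K_CUB = np.array([0.0, 50.0, 5.0])      # assumed cubic coefficients

def restoring_forces(q):
    """Structural restoring forces with concentrated cubic nonlinearities:
    f_i = k_i * q_i + k3_i * q_i**3 for q = [h, alpha, beta]."""
    q = np.asarray(q, dtype=float)
    return K_LIN * q + K_CUB * q**3

print(restoring_forces([0.01, 0.05, 0.10]))
```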
Abstract:
In the 1990s, Warrick and Hussen developed a method to scale Richards' equation (RE) for similar soils. In this paper, new scaled solutions are added to the method of Warrick and Hussen, considering a wider range of soils regardless of their dissimilarity. Gardner-Kozeny hydraulic functions are adopted instead of the Brooks-Corey functions used originally by Warrick and Hussen. These functions reduce the dependence of the scaled RE on the soil properties. To evaluate the proposed method (PM), the scaled RE was solved numerically using a finite difference method with a fully implicit scheme. Three cases were considered: constant-head infiltration, constant-flux infiltration, and drainage of an initially uniform wet soil. The results for five texturally different soils ranging from sand to clay (adopted from the literature) showed that the scaled solutions were invariant to a satisfactory degree. However, slight deviations were observed, mainly for the sandy soil. Moreover, the scaled solutions deviated when the soil profile was initially wet in the infiltration case or deeply wet in the drainage case. Based on the PM, a Philip-type model was also developed to approximate RE solutions for constant-head infiltration. The model showed good agreement with the scaled RE for the same range of soils and conditions, although only for Gardner-Kozeny soils. Such a procedure reduces numerical calculations and provides additional opportunities for solving the highly nonlinear RE for unsaturated water flow in soils. (C) 2011 Elsevier B.V. All rights reserved.
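For readers unfamiliar with the functions mentioned above, the sketch below shows a Gardner-type exponential conductivity function and the classical Philip two-term expression for cumulative infiltration under a constant head. The parameter values (saturated conductivity, Gardner coefficient, sorptivity) are illustrative assumptions, and the expressions are the standard textbook forms, not the scaled model derived in the paper.

```python
import numpy as np

def gardner_conductivity(h, Ks=1.0e-5, alpha_g=2.0):
    """Gardner exponential hydraulic conductivity K(h) = Ks * exp(alpha_g * h),
    with h the (negative) pressure head in m and Ks in m/s."""
    return Ks * np.exp(alpha_g * np.asarray(h))

def philip_infiltration(t, S=5.0e-4, A=5.0e-6):
    """Philip two-term cumulative infiltration I(t) = S*sqrt(t) + A*t,
    with sorptivity S in m/s^0.5 and A in m/s."""
    t = np.asarray(t, dtype=float)
    return S * np.sqrt(t) + A * t

t = np.array([60.0, 600.0, 3600.0])   # s
print("K(h = -1 m) =", gardner_conductivity(-1.0), "m/s")
print("I(t) =", philip_infiltration(t), "m")
```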
Abstract:
Objective: The aim of this study was to compare the correspondence between gap formation and apical microleakage in root canals filled with an epoxy resin-based sealer (AH Plus), combined or not with a resinous primer, or with a dimethacrylate-based root canal sealer (Epiphany). Material and Methods: Thirty-nine lower single-rooted human premolars were filled by the lateral condensation technique (LC) and immersed in a 50-wt% aqueous silver nitrate solution at 37 degrees C for 24 h. After longitudinal sectioning, epoxy resin replicas were made from the tooth specimens. Both the replicas and the specimens were prepared for scanning electron microscopy (SEM). The gaps were observed in the replicas. Apical microleakage was detected in the specimens by SEM/energy dispersive spectroscopy (SEM/EDS). The data were analyzed statistically using an ordinal logistic regression model and correspondence analysis (alpha=0.05). Results: Epiphany presented more regions containing gaps between dentin and sealer (p<0.05). There was correspondence between the presence of gaps and microleakage (p<0.05). Microleakage was similar among the root-filling materials (p>0.05). Conclusions: The resinous primer did not improve the sealing ability of the AH Plus sealer, and the presence of gaps had an effect on apical microleakage for all materials.
Abstract:
The flow around a smooth, fixed circular cylinder over a large range of Reynolds numbers is considered in this paper. In order to investigate this canonical case, we perform CFD calculations and apply verification and validation (V&V) procedures to draw conclusions regarding the numerical error and, afterwards, assess the modeling errors and the capability of this (U)RANS method to solve the problem. Eight Reynolds numbers between Re = 10 and Re = 5 x 10^5 will be presented with at least four geometrically similar grids and five discretizations in time for each case (when unsteady), together with strict control of iterative and round-off errors, allowing a consistent verification analysis with uncertainty estimation. Two-dimensional RANS calculations, steady or unsteady, laminar or turbulent, are performed. The original 1994 k-omega SST turbulence model by Menter is used to model turbulence. The validation procedure is performed by comparing the numerical results with an extensive set of experimental results compiled from the literature. [DOI: 10.1115/1.4007571]
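A common building block of the verification procedure mentioned above is Richardson extrapolation on geometrically similar grids: estimating the observed order of accuracy, the extrapolated solution, and a grid-convergence-index (GCI) style uncertainty. The sketch below implements the standard three-grid GCI form for a constant refinement ratio; it is a simplified stand-in for the more elaborate uncertainty estimators typically used in (U)RANS V&V studies, and the sample drag-coefficient values are invented for illustration.

```python
import math

def gci_three_grids(phi_fine, phi_medium, phi_coarse, r, Fs=1.25):
    """Observed order, Richardson-extrapolated value, and GCI uncertainty
    from three solutions on grids with constant refinement ratio r > 1."""
    eps21 = phi_medium - phi_fine
    eps32 = phi_coarse - phi_medium
    p = math.log(abs(eps32 / eps21)) / math.log(r)   # observed order of accuracy
    phi_ext = phi_fine + eps21 / (r**p - 1.0)        # Richardson extrapolation
    gci = Fs * abs(eps21) / (r**p - 1.0)             # uncertainty estimate (absolute)
    return p, phi_ext, gci

# Invented drag-coefficient values on fine/medium/coarse grids, refinement ratio 2
p, cd_ext, u_cd = gci_three_grids(1.095, 1.110, 1.150, r=2.0)
print(f"observed order p = {p:.2f}, extrapolated Cd = {cd_ext:.4f}, U = {u_cd:.4f}")
```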
Abstract:
OBJECTIVE: To evaluate the prevalence of urinary excretion of BKV and JCV in HIV-infected patients without neurological symptoms. METHODS: Urine samples from HIV-infected patients without neurological symptoms were tested for JC virus and BK virus by PCR. Samples were screened for the presence of polyomavirus with sets of primers complementary to the early region of the JCV and BKV genome (AgT). The presence of JC virus or BK virus was confirmed by two other PCR assays using sets of primers complementary to the VP1 gene of each virus. Analysis of the data was performed with the Kruskal-Wallis test for numerical data and the Pearson or Yates test for categorical variables. RESULTS: A total of 75 patients were included in the study. The overall prevalence of polyomavirus DNA urinary shedding was 67/75 (89.3%). Only BKV DNA was detected in 14/75 (18.7%) urine samples, and only JCV DNA was detected in 11/75 (14.7%) samples. Both BKV and JCV DNA were present in 42/75 (56.0%) samples. CONCLUSION: In this study we found high rates of excretion of JCV, BKV, and simultaneous excretion in HIV+ patients. These results also differ from others available in the literature.
Abstract:
Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools, and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. We have then defined a number of activities and associated guidelines prescribing how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus ensuring accurate data exchange and information interpretation from exchanged data.
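To illustrate the connector idea discussed above, the sketch below shows a minimal software connector that maps records from a hypothetical gene expression data source onto a small reference vocabulary and applies a transformation rule on the exchanged data. All field names, terms, and rules are invented for illustration and do not reproduce the reference ontology or the connectors developed in the work.

```python
import math
from typing import Any, Callable, Dict

class Connector:
    """Minimal connector: renames source fields to reference-vocabulary terms
    and applies transformation rules before handing data to an analysis tool."""

    def __init__(self, field_map: Dict[str, str],
                 rules: Dict[str, Callable[[Any], Any]]):
        self.field_map = field_map
        self.rules = rules

    def translate(self, record: Dict[str, Any]) -> Dict[str, Any]:
        out = {}
        for src_field, value in record.items():
            term = self.field_map.get(src_field, src_field)   # rename field
            rule = self.rules.get(term, lambda v: v)          # optional transformation
            out[term] = rule(value)
        return out

# Hypothetical mapping from a microarray result file to reference terms
field_map = {"probe": "gene_identifier", "ratio": "expression_level"}
rules = {"expression_level": lambda v: math.log2(float(v))}   # store log2 ratios

connector = Connector(field_map, rules)
print(connector.translate({"probe": "AT1G01010", "ratio": "2.0"}))
```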