48 results for Computational Geometry and Object Modeling


Relevance: 100.00%

Abstract:

Tropical ecosystems play a large and complex role in the global carbon cycle. Clearing of natural ecosystems for agriculture leads to large pulses of CO₂ to the atmosphere from terrestrial biomass. Concurrently, the remaining intact ecosystems, especially tropical forests, may be sequestering a large amount of carbon from the atmosphere in response to global environmental changes, including climate change and an increase in atmospheric CO₂. Here we use an approach that integrates census-based historical land use reconstructions, remote-sensing-based contemporary land use change analyses, and simulation modeling of terrestrial biogeochemistry to estimate the net carbon balance over the period 1901-2006 for the state of Mato Grosso, Brazil, which is one of the most rapidly changing agricultural frontiers in the world. By the end of this period, we estimate that of the state's 925,225 km², 221,092 km² have been converted to pastures and 89,533 km² have been converted to croplands, with forest-to-pasture conversions being the dominant land use trajectory but with transitions to croplands increasing rapidly in the last decade. These conversions have led to a cumulative release of 4.8 Pg C to the atmosphere, with ~80% from forest clearing and 20% from the clearing of cerrado. Over the same period, we estimate that the residual undisturbed ecosystems accumulated 0.3 Pg C in response to CO₂ fertilization. Therefore, the net emissions of carbon from Mato Grosso over this period were 4.5 Pg C. Net carbon emissions from Mato Grosso since 2000 averaged 146 Tg C/yr, on the order of Brazil's fossil fuel emissions during this period. These emissions were associated with the expansion of croplands to grow soybeans. While alternative management regimes in croplands, including tillage, fertilization, and cropping patterns, promote carbon storage in ecosystems, they remain a small portion of the net carbon balance for the region. This detailed accounting of a region's carbon balance is the type of foundational analysis needed by the new United Nations Collaborative Programme for Reducing Emissions from Deforestation and Forest Degradation (REDD).

Relevance: 100.00%

Abstract:

Due to the worldwide increase in demand for biofuels, the area cultivated with sugarcane is expected to increase. For environmental and economic reasons, an increasing proportion of this area is being harvested without burning, leaving the residues on the soil surface. This periodic input of residues affects soil physical, chemical and biological properties, as well as plant growth and nutrition. Modeling can be a useful tool in the study of the complex interactions between climate, residue quality, and the biological factors controlling plant growth and residue decomposition. The approach taken in this work was to parameterize the CENTURY model for the sugarcane crop, to simulate the temporal dynamics of aboveground phytomass and litter decomposition, and to validate the model with field experiment data. When studying aboveground growth, burned and unburned harvest systems were compared, as well as the effect of mineral fertilizer and organic residue applications. The simulations were performed with data from experiments of different durations, from 12 months to 60 years, in Goiana, Timbaúba and Pradópolis, Brazil; Harwood, Mackay and Tully, Australia; and Mount Edgecombe, South Africa. The differentiation of two pools in the litter, with different decomposition rates, was found to be a relevant factor in the simulations. Originally, the model had an essentially unlimited layer of mulch directly available for decomposition, 5,000 g m⁻². Through a parameter optimization process, the thickness of the mulch layer closer to the soil, and therefore more vulnerable to decomposition, was set at 110 g m⁻². By changing the layer of mulch available for decomposition at any given time, the simulated sugarcane residue decomposition was close to measured values (R² = 0.93), helping to make the CENTURY model a tool for the study of sugarcane litter decomposition patterns. The CENTURY model also accurately simulated aboveground stalk carbon values (R² = 0.76), considering burned and unburned harvest systems, plots with and without nitrogen fertilizer and organic amendment applications, under different climate and soil conditions.
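
The two-pool litter scheme described above lends itself to a very small numerical illustration. The sketch below assumes simple first-order decay for a surface-contact pool capped at 110 g m⁻², refilled from a standing-mulch pool; the rate constants, time step and initial residue mass are hypothetical, and CENTURY itself uses a more detailed pool structure with climate- and litter-quality-dependent rates.

    # Minimal sketch of a two-pool surface-litter decomposition scheme, inspired
    # by the CENTURY-style setup described above.  Rate constants, pool names and
    # the 110 g m-2 contact-layer cap are illustrative only.

    CONTACT_LAYER_CAP = 110.0   # g m-2 of mulch directly exposed to decomposition

    def step_litter(standing_mulch, contact_pool,
                    k_contact=0.02, k_standing=0.002, dt=1.0):
        """Advance the two litter pools by one time step (first-order decay).

        standing_mulch : residue not yet in contact with the soil (g m-2)
        contact_pool   : residue in the thin layer close to the soil (g m-2)
        k_contact, k_standing : decay rates per time step (hypothetical values)
        """
        # first-order decay of each pool
        loss_contact = k_contact * contact_pool * dt
        loss_standing = k_standing * standing_mulch * dt
        contact_pool -= loss_contact
        standing_mulch -= loss_standing

        # refill the contact layer from the standing mulch, up to its cap
        transfer = min(standing_mulch, max(0.0, CONTACT_LAYER_CAP - contact_pool))
        contact_pool += transfer
        standing_mulch -= transfer

        return standing_mulch, contact_pool, loss_contact + loss_standing

    # toy run: 1,000 g m-2 of fresh sugarcane residue, 24 monthly steps
    standing, contact = 1000.0 - CONTACT_LAYER_CAP, CONTACT_LAYER_CAP
    for month in range(24):
        standing, contact, lost = step_litter(standing, contact)
    print(f"residue remaining after 24 months: {standing + contact:.0f} g m-2")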

Relevance: 100.00%

Abstract:

Fatigue and crack propagation are phenomena affected by high uncertainties, for which deterministic methods fail to predict the structural life accurately. The present work aims at coupling reliability analysis with the boundary element method. The latter has been recognized as an accurate and efficient numerical technique for dealing with mixed-mode propagation, which is very interesting for reliability analysis. The coupled procedure allows us to consider uncertainties during the crack growth process. In addition, it computes the probability of fatigue failure for complex structural geometries and loadings. Two coupling procedures are considered: direct coupling of the reliability and mechanical solvers, and indirect coupling through the response surface method. Numerical applications show the performance of the proposed models in lifetime assessment under uncertainties, with the direct method showing faster convergence than the response surface method. (C) 2010 Elsevier Ltd. All rights reserved.
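
To make the coupling idea concrete, the sketch below estimates a probability of fatigue failure by plain Monte Carlo sampling over a closed-form Paris-law crack-growth model; it stands in for the boundary element solver and the direct/response-surface couplings actually used in the paper, and all material data, load levels and probability distributions are illustrative assumptions.

    # Monte Carlo estimate of fatigue failure probability with random initial
    # crack size and Paris coefficient.  Purely illustrative values.
    import numpy as np

    rng = np.random.default_rng(0)

    def cycles_to_failure(a0, C, m=3.0, delta_stress=100.0, Y=1.12, a_crit=0.02):
        """Cycles to grow a crack from a0 to a_crit (metres) under the Paris law
        da/dN = C * (Y * delta_stress * sqrt(pi * a))**m, integrated numerically."""
        a = np.linspace(a0, a_crit, 500)
        dK = Y * delta_stress * np.sqrt(np.pi * a)       # MPa*sqrt(m)
        integrand = 1.0 / (C * dK**m)                    # cycles per metre
        return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(a))

    # random variables: initial crack size and Paris coefficient (illustrative)
    n = 5000
    a0 = rng.lognormal(mean=np.log(1e-3), sigma=0.3, size=n)    # m
    C = rng.lognormal(mean=np.log(3e-12), sigma=0.2, size=n)

    target_life = 1.5e6                                  # design life, cycles
    lives = np.array([cycles_to_failure(ai, ci) for ai, ci in zip(a0, C)])
    pf = np.mean(lives < target_life)                    # probability of failure
    print(f"estimated probability of fatigue failure: {pf:.3f}")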

Relevance: 100.00%

Abstract:

Most ordinary finite element formulations for 3D frame analysis do not consider the warping of cross-sections as part of their kinematics. Thus the torsional stiffness must be introduced directly by the user into the computational software, and the bar is treated as if it worked under a no-warping hypothesis. This approach does not give good results for general structural elements used in engineering. Both displacement and stress calculations reveal noticeable deficiencies in linear and non-linear applications. For linear analysis, displacements can be corrected by assuming a stiffness that results in acceptable global displacements of the analyzed structure; the stress calculation, however, will be far from reality. For non-linear analysis the deficiencies are even worse. In the past forty years, special structural matrix analysis and finite element formulations have been proposed in the literature to include warping and bending-torsion effects in 3D general frame analysis for both linear and non-linear situations. In this work, using a kinematics enhancement technique, the degree of freedom "warping intensity" is introduced following a new approach for 3D frame elements. This degree of freedom is associated with the warping basic mode, a geometric characteristic of the cross-section; it does not have a direct relation with the rate of twist rotation along the longitudinal axis, as in existing formulations. Moreover, a linear strain variation mode is provided for the geometrically non-linear approach, for which a complete 3D constitutive relation (Saint-Venant-Kirchhoff) is adopted. The proposed technique allows the consideration of inhomogeneous cross-sections with any geometry. Various examples demonstrate the accuracy and applicability of the proposed formulation. (C) 2009 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Abstract:

An improvement to a quality two-dimensional Delaunay mesh generation algorithm, which combines the mesh refinement strategies of Ruppert and Shewchuk, is proposed in this work. The technique uses the diametral lens criterion, introduced by L. P. Chew, with the purpose of eliminating extremely obtuse triangles in the boundary of the mesh. The method splits the boundary segments and obtains an initial pre-refinement, thus reducing the number of iterations necessary to generate a high-quality sequential triangulation. Moreover, it decreases the amount of communication and synchronization between subdomains in parallel mesh refinement.
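
A minimal sketch of the encroachment test underlying this kind of refinement is given below. A vertex lies inside the diametral circle of a segment exactly when it subtends an angle of at least 90 degrees at the segment's endpoints, and Chew's diametral lens shrinks that region to points subtending a larger angle; the 120-degree threshold used here is an illustrative choice, not a value taken from the paper.

    # Encroachment test for Ruppert/Chew-style Delaunay refinement: a point
    # encroaches upon a boundary segment when it subtends a sufficiently large
    # angle at the segment (90 deg -> diametral circle, larger -> diametral lens).
    import math

    def subtended_angle(p, a, b):
        """Angle (in degrees) that segment ab subtends at point p."""
        ax, ay = a[0] - p[0], a[1] - p[1]
        bx, by = b[0] - p[0], b[1] - p[1]
        dot = ax * bx + ay * by
        cross = ax * by - ay * bx
        return math.degrees(math.atan2(abs(cross), dot))

    def encroaches(p, a, b, threshold_deg=120.0):
        """True if p lies inside the lens-shaped region of segment ab defined
        by the angle threshold."""
        return subtended_angle(p, a, b) >= threshold_deg

    # example: a vertex close to the segment encroaches, a distant one does not
    a, b = (0.0, 0.0), (1.0, 0.0)
    print(encroaches((0.5, 0.1), a, b))   # True: very obtuse angle at p
    print(encroaches((0.5, 2.0), a, b))   # False: segment seen under a small angle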

Relevance: 100.00%

Abstract:

Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low-strength structural steels, which do not necessarily address specific requirements of the high-grade steels currently in use. For these cases, failure assessments may be overly conservative or show significant scatter in their predictions, leading to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion, built upon plastic instability analysis, to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load, to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst tests of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria derived from plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corrosion defects. (C) 2008 Elsevier Ltd. All rights reserved.
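
As a point of reference only, the sketch below evaluates a textbook B31G-style plastic-collapse estimate of burst pressure for a single axial defect; this is representative of the simplified criteria the paper argues can be overly conservative, not the local ligament-instability criterion it develops, and the pipe geometry and material values are hypothetical.

    # Textbook B31G-style plastic-collapse estimate of the burst pressure of a
    # pipe with one axial corrosion defect.  Illustrative only; NOT the
    # stress-based ligament-instability criterion studied in the paper.
    import math

    def b31g_style_burst_pressure(D, t, d, L, smys):
        """Estimate failure pressure (same units as smys) for an axial defect.

        D, t : pipe outer diameter and wall thickness
        d, L : defect depth and axial length
        smys : specified minimum yield strength
        """
        flow_stress = 1.1 * smys                      # simple flow-stress estimate
        M = math.sqrt(1.0 + 0.8 * L**2 / (D * t))     # Folias (bulging) factor
        ratio = d / t
        reduction = (1.0 - (2.0 / 3.0) * ratio) / (1.0 - (2.0 / 3.0) * ratio / M)
        return (2.0 * flow_stress * t / D) * reduction

    # hypothetical 20-inch line pipe with a 50%-deep, 200 mm long defect
    p_f = b31g_style_burst_pressure(D=508.0, t=9.5, d=4.75, L=200.0, smys=415.0)
    print(f"estimated burst pressure: {p_f:.1f} MPa")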

Relevance: 100.00%

Abstract:

The development and fabrication of a thermo-electro-optic sensor using a Mach-Zehnder interferometer and a resistive micro-heater placed on one of the device's arms is presented. The Mach-Zehnder structure was fabricated on a single-crystal silicon substrate using silicon oxynitride and hydrogenated amorphous silicon carbide films to form an anti-resonant reflective optical waveguide. The materials were deposited by plasma-enhanced chemical vapor deposition at low temperatures (~320 °C). To optimize the heat transfer and increase the device response to current variation, part of the Mach-Zehnder sensor arm was suspended through front-side bulk micromachining of the silicon substrate in a KOH solution. With the temperature variation caused by the micro-heater, the refractive index of the core layer of the optical waveguide changes due to the thermo-optic effect. Since this variation occurs in only one of the Mach-Zehnder's arms, a phase difference between the arms is produced, leading to optical interference. In this way, the current applied to the micro-resistor can control the device's output optical power. In addition, reactive ion etching was used to define the device's geometry, and a study of SF6-based etching rates for different compositions of silicon oxynitride films is also presented. (C) 2007 Elsevier B.V. All rights reserved.
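
The sensing principle can be summarized with the ideal Mach-Zehnder transfer function: the heater changes the core index through the thermo-optic coefficient, the index change over the heated length produces a phase difference, and the output power varies as cos²(Δφ/2). The sketch below uses hypothetical values for the thermo-optic coefficient, heated length and wavelength, not the parameters of the fabricated device.

    # Ideal, lossless, balanced Mach-Zehnder transfer function under a
    # thermo-optic phase shift in one arm.  Coefficient, length and wavelength
    # are illustrative assumptions.
    import math

    def mzi_output_power(delta_T, p_in=1.0, dn_dT=2e-4,
                         heated_length=2e-3, wavelength=1.55e-6):
        """Relative output power when one arm is heated by delta_T (kelvin)."""
        delta_n = dn_dT * delta_T                        # thermo-optic index change
        delta_phi = 2.0 * math.pi * delta_n * heated_length / wavelength
        return p_in * math.cos(delta_phi / 2.0) ** 2     # constructive -> destructive

    for dT in (0.0, 0.5, 1.0, 1.94):   # 1.94 K ~ a pi phase shift (output minimum)
        print(f"dT = {dT:4.2f} K -> P_out/P_in = {mzi_output_power(dT):.3f}")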

Relevance: 100.00%

Abstract:

The preslaughter handling and transport of broilers are stressful operations that might affect welfare and meat quality and could increase the number of deaths before slaughter. However, the influence of thermal factors during transportation and lairage at slaughterhouses is complex in subtropical regions, where rising temperature and high relative humidity (RH) are the major concerns for animal survival before slaughter. In this study we assessed the influence of a controlled lairage environment on the preslaughter mortality rates of broiler chickens transported during different seasons of the year and held for varying lairage times in a subtropical climate. Preslaughter data from 13,937 broiler flocks were recorded daily during 2006 at a commercial slaughterhouse in southeastern Brazil. The main factors that influenced daily mortality rate were mean dry bulb temperature and RH, lairage time, period of the day, density of broilers per crate, season of the year, stocking density per lorry, transport time, and distance between farms and the slaughterhouse. A holding area at the slaughterhouse with environmental control was assessed. Using a double GLM for mean and dispersion modeling, the seasons were found to have significant effects (P < 0.05) on average mortality rates. The highest incidence was observed in summer (0.42%), followed by spring (0.39%), winter (0.28%), and autumn (0.23%). A decrease in the preslaughter mortality of broilers during summer (P < 0.05) was observed when lairage time was increased, mainly after 1 h of exposure to the controlled environment. Thus, lairage for 3 to 4 h in a controlled environment during summer and spring is necessary to reduce the thermal load on broiler chickens.

Relevance: 100.00%

Abstract:

Degenerative aortic valve disease (DAVD), a common finding in the elderly, is associated with an increased risk of death from cardiovascular causes. Taking advantage of its longitudinal design, this study evaluates the prevalence of DAVD and its temporal associations with long-term exposure to cardiovascular risk factors in the general population. We studied 953 subjects (aged 25-74 years) from a random sample of German residents. Risk factors had been determined at a baseline investigation in 1994/95. At a follow-up investigation 10 years later, standardized echocardiography determined aortic valve morphology and aortic valve area (AVA) as well as left ventricular geometry and function. At the follow-up study, the overall prevalence of DAVD was 28%. In logistic regression models adjusting for traditional cardiovascular risk factors at baseline, age (OR 2.0 [1.7-2.3] per 10 years, P < 0.001), active smoking (OR 1.7 [1.1-2.4], P = 0.009) and elevated total cholesterol levels (OR 1.2 [1.1-1.3] per 20 mg/dL increase, P < 0.001) were significantly related to DAVD at follow-up. Furthermore, age, baseline smoking status, and total cholesterol level were significant predictors of a smaller AVA at the follow-up study. In contrast, hypertension and obesity had no detectable relationship with long-term changes in aortic valve structure. In the general population we observed a high prevalence of DAVD that is associated with long-term exposure to elevated cholesterol levels and active smoking. These findings strengthen the notion that smoking cessation and cholesterol lowering are promising treatment targets for the prevention of DAVD.

Relevance: 100.00%

Abstract:

The human brain is often considered to be the most cognitively capable among mammalian brains and to be much larger than expected for a mammal of our body size. Although the number of neurons is generally assumed to be a determinant of computational power, and despite the widespread claims that the human brain contains 100 billion neurons and ten times more glial cells, the absolute numbers of neurons and glial cells in the human brain remain unknown. Here we determine these numbers by using the isotropic fractionator and compare them with the expected values for a human-sized primate. We find that the adult male human brain contains on average 86.1 +/- 8.1 billion NeuN-positive cells ("neurons") and 84.6 +/- 9.8 billion NeuN-negative ("nonneuronal") cells. With only 19% of all neurons located in the cerebral cortex, the greater cortical size (representing 82% of total brain mass) in humans compared with other primates does not reflect an increased relative number of cortical neurons. The ratios between glial cells and neurons in human brain structures are similar to those found in other primates, and their numbers of cells match those expected for a primate of human proportions. These findings challenge the common view that humans stand out from other primates in their brain composition and indicate that, with regard to numbers of neuronal and nonneuronal cells, the human brain is an isometrically scaled-up primate brain. J. Comp. Neurol. 513:532-541, 2009. (c) 2009 Wiley-Liss, Inc.

Relevance: 100.00%

Abstract:

Solving multicommodity capacitated network design problems is a hard task that requires the use of several strategies, such as relaxing some constraints and strengthening the model with valid inequalities. In this paper, we compare three sets of inequalities that have been widely used in this context: Benders, metric and cutset inequalities. We show that Benders inequalities associated with extreme rays are metric inequalities. We also show how to strengthen Benders inequalities associated with non-extreme rays to obtain metric inequalities. We show that cutset inequalities are Benders inequalities, but not necessarily metric inequalities. We give a necessary and sufficient condition for a cutset inequality to be a metric inequality. Computational experiments show the effectiveness of strengthening Benders and cutset inequalities to obtain metric inequalities.
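
For concreteness, a generic cutset inequality for a single-facility capacitated network design model can be written as follows (standard textbook notation, not the paper's): for a node subset S, the capacity installed on the arcs leaving S must cover all demand that has to cross the cut,

    \sum_{a \in \delta^{+}(S)} u \, y_a \;\ge\; \sum_{k \,:\, o(k) \in S,\; d(k) \notin S} d^{k},

where y_a is the integer number of capacity modules of size u installed on arc a, and d^k is the demand of commodity k with origin o(k) and destination d(k). Dividing the right-hand side by u and rounding up gives the usual strengthened (rounded) form of the inequality.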

Relevance: 100.00%

Abstract:

The final contents of total and individual trans-fatty acids in sunflower oil, formed during the deacidification step of physical refining, were obtained using a computational simulation program that considered the features of the cis-trans isomerization reactions for oleic, linoleic, and linolenic acids attached to the glycerol backbone of triacylglycerols. The impact of process variables, such as temperature and liquid flow rate, and of equipment configuration parameters, such as liquid height, diameter, and number of stages, which influence the retention time of the oil in the equipment, was analyzed using response-surface methodology (RSM). The computational simulation and the RSM results were used in two different optimization methods, aiming to minimize the final levels of total and individual trans-fatty acids (trans-FA) while keeping neutral oil loss and final oil acidity at low values. The main goal of this work was to show that computational simulation, based on careful modeling of the reaction system and combined with optimization, could be an important tool for identifying better processing conditions in industrial physical refining plants for vegetable oils with respect to trans-FA formation.
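
The response-surface step can be illustrated with a tiny least-squares fit: a full second-order polynomial in two process variables acts as a cheap surrogate of the simulator, and the minimum of the fitted surface suggests candidate operating conditions. The variables, design points and responses below are made up for illustration only and are not results from the paper.

    # Second-order response-surface fit over two hypothetical process variables
    # (temperature and liquid flow rate), used as a surrogate for the simulator.
    import numpy as np

    def design_matrix(T, F):
        """Second-order RSM model: 1, T, F, T^2, F^2, T*F."""
        return np.column_stack([np.ones_like(T), T, F, T**2, F**2, T * F])

    # made-up simulator outputs (trans-FA content, %) at a few design points
    T = np.array([230.0, 230.0, 250.0, 250.0, 240.0, 240.0, 240.0])   # deg C
    F = np.array([4.0,   8.0,   4.0,   8.0,   6.0,   6.0,   6.0])     # t/h
    y = np.array([0.45,  0.30,  0.95,  0.70,  0.55,  0.57,  0.54])

    beta, *_ = np.linalg.lstsq(design_matrix(T, F), y, rcond=None)

    # evaluate the fitted surface on a coarse grid and pick the best point
    Tg, Fg = np.meshgrid(np.linspace(230, 250, 21), np.linspace(4, 8, 21))
    surface = design_matrix(Tg.ravel(), Fg.ravel()) @ beta
    best = np.argmin(surface)
    print(f"lowest predicted trans-FA: {surface[best]:.2f}% "
          f"at T = {Tg.ravel()[best]:.0f} C, flow = {Fg.ravel()[best]:.1f} t/h")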

Relevance: 100.00%

Abstract:

In this study, the molecular structure and conformational analyses of 4-isopropylthioxanthone (4-ITX) are reported based on experimental and theoretical results. The compound crystallizes in the centrosymmetric space group P-1 with only one molecule in the asymmetric unit, presenting the most stable conformation, in which the three fused rings adopt a planar geometry and the isopropyl group assumes the torsional angle with the least steric hindrance. The structural and conformational analyses were performed using theoretical calculations such as Hartree-Fock (HF) and DFT methods in combination with the 6-311G(d,p) and 6-31++G(d,p) basis sets, and the results were compared with infrared spectroscopy (FT-IR) and X-ray diffraction (XRD) data. The supramolecular assembly of 4-ITX is held together by non-classical C-H···O hydrogen bonds and weak interactions such as π-π stacking. 4-ITX was also studied by ¹H and ¹³C NMR spectroscopy. UV-Vis absorption spectroscopy showed that the long-wavelength maximum of 4-ITX shifts towards higher energy as the solvent polarity increases. (C) 2011 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

This paper describes the first phase of a project attempting to construct an efficient general-purpose nonlinear optimizer using an augmented Lagrangian outer loop with a relative error criterion and an inner loop employing a state-of-the-art conjugate gradient solver. The outer loop can also employ double-regularized proximal kernels, a fairly recent theoretical development that leads to fully smooth subproblems. We first enhance the existing theory to show that our approach is globally convergent in both the primal and dual spaces when applied to convex problems. We then present an extensive computational evaluation using the CUTE test set, showing that some aspects of our approach are promising, but some are not. These conclusions in turn lead to additional computational experiments suggesting where to focus our theoretical and computational efforts next.
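
A bare-bones version of the outer/inner structure described above is sketched below: an augmented Lagrangian outer loop whose unconstrained subproblems are solved by SciPy's nonlinear conjugate gradient method. The test problem, multiplier and penalty updates, and the simple absolute stopping tolerance are all illustrative; in particular, the relative error criterion and the double-regularized kernels discussed in the paper are not implemented here.

    # Textbook augmented Lagrangian outer loop with a nonlinear CG inner solver.
    # Problem, updates and tolerances are illustrative assumptions.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):                      # objective: a shifted quadratic
        return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

    def c(x):                      # single equality constraint c(x) = 0
        return np.array([x[0] + x[1] - 2.0])

    def augmented_lagrangian(x, lam, rho):
        r = c(x)
        return f(x) + lam @ r + 0.5 * rho * (r @ r)

    x, lam, rho = np.zeros(2), np.zeros(1), 10.0
    for outer_iter in range(20):
        # inner loop: unconstrained minimization by nonlinear conjugate gradients
        res = minimize(augmented_lagrangian, x, args=(lam, rho), method="CG")
        x = res.x
        r = c(x)
        lam = lam + rho * r                  # first-order multiplier update
        if np.linalg.norm(r) < 1e-8:
            break
        rho *= 2.0                           # crude penalty increase
    print("solution:", x, "multiplier:", lam)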

Relevance: 100.00%

Abstract:

A Nonlinear Programming algorithm that converges to second-order stationary points is introduced in this paper. The main tool is a second-order negative-curvature method for box-constrained minimization of a certain class of functions that do not possess continuous second derivatives. This method is used to define an Augmented Lagrangian algorithm of PHR (Powell-Hestenes-Rockafellar) type. Convergence proofs under weak constraint qualifications are given. Numerical examples showing that the new method converges to second-order stationary points in situations in which first-order methods fail are exhibited.
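
For reference, the PHR augmented Lagrangian for a problem with inequality constraints g_i(x) ≤ 0 can be written, up to a term that does not depend on x, as

    L_{\rho}(x, \lambda) \;=\; f(x) \;+\; \frac{\rho}{2} \sum_{i=1}^{m} \left[ \max\!\left( 0,\; g_i(x) + \frac{\lambda_i}{\rho} \right) \right]^{2},

where ρ is the penalty parameter and λ the multiplier estimate. The squared max terms are continuously differentiable but in general do not possess continuous second derivatives, which is precisely the class of box-constrained subproblems the negative-curvature method described above is designed to handle.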