48 results for solve


Abstract:

The applicability of a meshfree approximation method, namely the EFG method, to fully geometrically exact analysis of plates is investigated. Based on a unified nonlinear theory of plates, which allows for arbitrarily large rotations and displacements, a Galerkin approximation via MLS functions is established. A hybrid method of analysis is proposed, where the solution is obtained by the independent approximation of the generalized internal displacement fields and the generalized boundary tractions. A consistent linearization procedure is performed, resulting in a semi-definite generalized tangent stiffness matrix which, for hyperelastic materials and conservative loadings, is always symmetric (even for configurations far from the generalized equilibrium trajectory). Besides the total Lagrangian formulation, an updated version is also presented, which enables the treatment of rotations beyond the parameterization limit. An extension of the arc-length method that includes the generalized domain displacement fields, the generalized boundary tractions and the load parameter in the constraint equation of the hyper-ellipse is proposed to solve the resulting nonlinear problem. Extending the hybrid-displacement formulation, a multi-region decomposition is proposed to handle complex geometries. A criterion for the classification of the equilibrium's stability, based on bordered Hessian matrix analysis, is suggested. Several numerical examples are presented, illustrating the effectiveness of the method. Unlike standard finite element methods (FEM), the resulting solutions are (arbitrarily) smooth generalized displacement and stress fields. (c) 2007 Elsevier Ltd. All rights reserved.
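
To make the MLS ingredient concrete, here is a minimal one-dimensional moving least squares approximation in Python, the kind of shape-function construction an EFG method builds its Galerkin basis from. This is an illustrative sketch, not the paper's formulation; the linear basis, quartic spline weight, node layout, and support radius are all assumptions.

```python
import numpy as np

def mls_approx(x, nodes, u_nodes, support=0.3):
    """Evaluate the MLS fit of nodal data u_nodes at point x,
    using a linear basis p = [1, x] and a quartic spline weight."""
    p_x = np.array([1.0, x])
    A = np.zeros((2, 2))                  # moment matrix
    B = np.zeros((2, len(nodes)))
    for i, xi in enumerate(nodes):
        r = abs(x - xi) / support
        if r >= 1.0:
            continue                      # node outside the support
        w = 1 - 6*r**2 + 8*r**3 - 3*r**4  # quartic spline weight
        p_i = np.array([1.0, xi])
        A += w * np.outer(p_i, p_i)
        B[:, i] = w * p_i
    phi = p_x @ np.linalg.solve(A, B)     # MLS shape functions at x
    return phi @ u_nodes

nodes = np.linspace(0.0, 1.0, 11)
u_nodes = np.sin(np.pi * nodes)           # sample field to approximate
print(mls_approx(0.37, nodes, u_nodes))   # close to sin(0.37 * pi)
```

Unlike FEM shape functions, these are smooth across the whole support, which is what yields the smooth displacement and stress fields the abstract mentions.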

Abstract:

The objective of this work is to develop an improved model of the human thermal system. The features included are important for solving real problems: 3D heat conduction, the use of elliptical cylinders to adequately approximate body geometry, the careful representation of tissues and important organs, and the flexibility of the computational implementation. The focus is on the passive system, which is composed of 15 cylindrical elements and includes heat transfer between large arteries and veins. The results of thermal neutrality and transient simulations are in excellent agreement with experimental data, indicating that the model adequately represents the behavior of the human thermal system. (C) 2009 Elsevier Ltd. All rights reserved.
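
As a toy illustration of the passive-system building block (a heated, perfused tissue cylinder), the sketch below solves the 1D radial Pennes bioheat equation by explicit finite differences. It is not the paper's model, which is 3D with elliptical cylinders and arterio-venous heat exchange; all property values and boundary conditions here are illustrative.

```python
import numpy as np

k, rho, c = 0.5, 1050.0, 3600.0           # tissue conductivity, density, heat capacity
w_b, rho_b, c_b = 0.0005, 1060.0, 3850.0  # blood perfusion rate and blood properties
q_m, T_a = 700.0, 37.0                    # metabolic heat (W/m^3), arterial temp (C)
R, N = 0.05, 50                           # cylinder radius (m), radial nodes
dr = R / N
dt = 0.05                                 # s, well below the explicit stability limit
T = np.full(N + 1, 37.0)                  # initial uniform temperature

for step in range(int(600 / dt)):         # simulate 10 minutes
    Tn = T.copy()
    for i in range(1, N):
        r = i * dr
        # (1/r) d/dr (r k dT/dr) discretized with central differences
        lap = (Tn[i+1] - 2*Tn[i] + Tn[i-1]) / dr**2 \
              + (Tn[i+1] - Tn[i-1]) / (2 * r * dr)
        perf = w_b * rho_b * c_b * (T_a - Tn[i])    # perfusion heat exchange
        T[i] = Tn[i] + dt * (k * lap + perf + q_m) / (rho * c)
    T[0] = T[1]                           # symmetry condition at the axis
    T[N] = 33.0                           # fixed skin-surface temperature
print(f"core {T[0]:.2f} C, skin {T[N]:.2f} C")
```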

Abstract:

The exact vibration modes and natural frequencies of planar structures and mechanisms comprising Euler-Bernoulli beams are obtained by solving a transcendental, nonlinear eigenvalue problem stated by the dynamic stiffness matrix (DSM). The technique most employed to solve this kind of problem is the Wittrick-Williams algorithm, developed in the early seventies. By formulating a new type of eigenvalue problem, which preserves the internal degrees of freedom for all members in the model, the present study offers an alternative to the use of this algorithm. The newly proposed eigenvalue problem presents no poles, so the roots of the problem can be found by any suitable iterative numerical method. By avoiding a standard formulation for the DSM, the local mode shapes are directly calculated and any extension to the beam theory can be easily incorporated. It is shown that the method adopted here leads to exact solutions, as confirmed by various examples. Extensions of the formulation are also given, where rotary inertia, end release, skewed edges and rigid offsets are all included. (C) 2008 Elsevier Ltd. All rights reserved.
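
To illustrate what a transcendental eigenvalue problem means here, the sketch below finds the first exact natural frequencies of a single clamped-free Euler-Bernoulli beam from its textbook frequency equation cos(bL)cosh(bL) = -1. This is standard material, not the paper's DSM formulation, and the beam properties are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

def char_eq(bL):
    # clamped-free frequency equation: cos(bL) cosh(bL) + 1 = 0
    return np.cos(bL) * np.cosh(bL) + 1.0

# roots are isolated; bracket each one and refine by bisection-type search
brackets = [(1.0, 3.0), (4.0, 6.0), (7.0, 9.0)]
bL_roots = [brentq(char_eq, a, b) for a, b in brackets]  # 1.875, 4.694, 7.855

# illustrative steel beam: modulus, second moment, density, area, length
E, I, rho, A, L = 210e9, 1e-8, 7850.0, 1e-4, 1.0
freqs_hz = [(bL / L)**2 * np.sqrt(E * I / (rho * A)) / (2 * np.pi)
            for bL in bL_roots]
print([f"{f:.1f} Hz" for f in freqs_hz])
```

Because the equation is transcendental, there are infinitely many such roots, and a DSM assembles matrix entries with the same transcendental character; the paper's contribution is a pole-free restatement so that any iterative root-finder works safely.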

Abstract:

In this study, the concept of cellular automata is applied in an innovative way to simulate the separation of phases in a water/oil emulsion. The velocity of the water droplets is calculated from the balance of forces acting on a pair of droplets in a group, and a cellular automaton is used to simulate the whole group of droplets. Thus, it is possible to solve the problem stochastically and to show the sequence of droplet collisions and coalescence phenomena. This methodology enables the calculation of the amount of water that can be separated from the emulsion under different operating conditions, thus enabling the process to be optimized. Comparisons between the results obtained from the developed model and the operational performance of an actual desalting unit are carried out. The accuracy observed shows that the developed model is a good representation of the actual process. (C) 2010 Published by Elsevier Ltd.
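
As a flavor of the approach, and no more than that, here is a toy stochastic cellular automaton in which water cells settle through oil under gravity. The paper's force-balance droplet velocities and coalescence rules are not reproduced; the grid size, water fraction, and settling probability are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
H, W = 60, 40
drop = rng.random((H, W)) < 0.15      # True = water droplet cell, False = oil

for step in range(500):
    # sweep bottom-up so each droplet falls at most one cell per step
    for i in range(H - 2, -1, -1):
        for j in range(W):
            if drop[i, j] and not drop[i + 1, j] and rng.random() < 0.8:
                drop[i + 1, j], drop[i, j] = True, False  # stochastic settling

water_lower = drop[H // 2:, :].sum() / drop.sum()
print(f"fraction of water in the lower half: {water_lower:.2f}")
```

The separable-water estimate in the paper plays the role of the lower-half fraction computed here, but with physically derived transition rules instead of a fixed probability.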

Abstract:

This work introduces the problem of the best choice among M combinations of the shortest paths for dynamic provisioning of lightpaths in all-optical networks. To solve this problem in an optimized way (shortest path and load balance), a new fixed routing algorithm, named Best among the Shortest Routes (BSR), is proposed. The BSR's performance is compared, in terms of blocking probability and network utilization, with Dijkstra's shortest path algorithm and other algorithms proposed in the literature. The evaluated scenarios include several representative topologies for all-optical networking and different wavelength conversion architectures. For all studied scenarios, BSR achieved superior performance. (C) 2010 Elsevier B.V. All rights reserved.
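
The underlying idea, choosing among the M shortest candidate routes with load balance in mind, can be sketched as below. This is not the BSR algorithm itself (the abstract does not spell out its selection rule); the prefer-shorter-then-lighter tie-breaking, the toy topology, and the load counters are assumptions.

```python
from itertools import islice
import networkx as nx

G = nx.cycle_graph(6)                    # toy all-optical topology
load = {tuple(sorted(e)): 0 for e in G.edges}

def route(G, src, dst, M=3):
    """Among up to M shortest candidate paths, pick the least loaded."""
    candidates = list(islice(nx.shortest_simple_paths(G, src, dst), M))
    best = min(candidates,
               key=lambda p: (len(p),    # prefer shorter, then lighter
                              sum(load[tuple(sorted((u, v)))]
                                  for u, v in zip(p, p[1:]))))
    for u, v in zip(best, best[1:]):
        load[tuple(sorted((u, v)))] += 1  # provision the lightpath
    return best

print(route(G, 0, 3))
print(route(G, 0, 3))   # second request takes the other direction around the ring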

Abstract:

This work proposes the use of evolutionary computation to jointly solve the maximum-likelihood multiuser channel estimation (MuChE) and detection problems in direct sequence code division multiple access (DS/CDMA) systems. The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods found in the literature. Simulation results for a genetic algorithm (GA) applied to multipath DS/CDMA MuChE and multi-user detection (MuD) show that the proposed genetic algorithm multi-user channel estimation (GAMuChE) yields a normalized mean square estimation error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies and medium system load, while exhibiting lower complexity than both maximum-likelihood multi-user channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multi-user detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity compared to the optimum multi-user detector (OMuD). In addition, the complexity of the GAMuChE and GAMuD algorithms was analyzed jointly in terms of the number of operations necessary to reach convergence, and compared to other joint MuChE and MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future. Copyright (C) 2010 John Wiley & Sons, Ltd.
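
To make the search principle concrete, here is a stripped-down GA (truncation selection plus Gaussian mutation, no crossover) estimating a toy two-tap channel from known pilots by minimizing the squared reconstruction error. It illustrates the idea only, not the paper's GAMuChE; the channel, pilot scheme, and GA settings are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
h_true = np.array([0.9, 0.4])                  # unknown channel taps
x = rng.choice([-1.0, 1.0], size=200)          # known pilot chips
y = np.convolve(x, h_true)[:200] + 0.05 * rng.standard_normal(200)

def fitness(h):
    # negative squared error between received and reconstructed signals
    return -np.mean((y - np.convolve(x, h)[:200]) ** 2)

pop = rng.uniform(-1, 1, size=(40, 2))         # initial random population
for gen in range(60):
    scores = np.array([fitness(h) for h in pop])
    parents = pop[np.argsort(scores)][-20:]    # truncation selection
    kids = parents[rng.integers(0, 20, 20)] + 0.05 * rng.standard_normal((20, 2))
    pop = np.vstack([parents, kids])           # elitism + mutated offspring

best = max(pop, key=fitness)
print(best, "vs true", h_true)
```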

Abstract:

Hub-and-spoke networks are widely studied in the area of location theory. They arise in several contexts, including passenger airlines, postal and parcel delivery, and computer and telecommunication networks. Hub location problems usually involve three simultaneous decisions: the optimal number of hub nodes, their locations, and the allocation of the non-hub nodes to the hubs. In the uncapacitated single allocation hub location problem (USAHLP), hub nodes have no capacity constraints and each non-hub node must be assigned to exactly one hub. In this paper, we propose three variants of a simple and efficient multi-start tabu search heuristic, as well as a two-stage integrated tabu search heuristic, to solve this problem. With the multi-start heuristics, several different initial solutions are constructed and then improved by tabu search, while in the two-stage integrated heuristic, tabu search is applied to improve both the locational and the allocational part of the problem. Computational experiments using typical benchmark problems (Civil Aeronautics Board (CAB) and Australian Post (AP) data sets) as well as new and modified instances show that our approaches consistently return the optimal or best-known results in very short CPU times, thus allowing the possibility of efficiently solving larger instances of the USAHLP than those found in the literature. We also report the integer optimal solutions for all 80 CAB data set instances and the 12 AP instances with up to 100 nodes, as well as for the corresponding newly generated AP instances with reduced fixed costs. Published by Elsevier Ltd.
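
The sketch below shows multi-start tabu search on a deliberately simplified, USAHLP-flavored toy: hubs are chosen, every node is allocated to its nearest hub, and the search swaps hubs in and out while forbidding recently removed hubs from re-entering. The Euclidean allocation cost, uniform fixed cost, fixed hub count, and tenure are all assumptions; the paper's three variants and two-stage heuristic are far richer.

```python
import random
import numpy as np

rng = random.Random(1)
n = 12
pts = np.array([[rng.random(), rng.random()] for _ in range(n)])
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
fixed_cost = 0.5                           # cost of opening one hub

def cost(hubs):
    # nearest-hub allocation cost plus hub opening costs
    return dist[:, hubs].min(axis=1).sum() + fixed_cost * len(hubs)

def tabu_search(start, iters=200, tenure=5):
    current = sorted(start)
    best = current
    tabu = {}                              # node -> iteration until which it is tabu
    for it in range(iters):
        moves = []
        for h in current:                  # try swapping hub h for non-hub v
            for v in range(n):
                if v not in current and tabu.get(v, -1) < it:
                    cand = sorted((set(current) - {h}) | {v})
                    moves.append((cost(cand), cand, h))
        c, cand, removed = min(moves)      # best admissible neighbor
        tabu[removed] = it + tenure        # forbid re-adding the removed hub
        current = cand
        if c < cost(best):
            best = cand
    return best

starts = [rng.sample(range(n), 3) for _ in range(5)]   # multi-start
best = min((tabu_search(s) for s in starts), key=cost)
print(best, round(cost(best), 3))
```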

Abstract:

The use of the Boltzmann transform function, lambda(theta), to solve the Richards equation when the diffusivity, D, is a function of soil water content, theta, alone is now commonplace in the literature. Nevertheless, a new analytic solution of the Boltzmann transform lambda(h) as a function of matric potential for horizontal water infiltration into a sand was derived without invoking the concept or use of D(theta). The derivation assumes that a similarity exists between the soil water retention function and the Boltzmann transform lambda(theta). The solution successfully described soil water content profiles experimentally measured for different infiltration times into a homogeneous sand and agrees with those presented by Philip in 1955 and 1957. The applicability of this solution to all soils remains open, but it is anticipated to hold for soils whose air-filled pore-size distribution before wetting is sufficiently narrow to yield a sharp increase of water content at the wetting front during infiltration. It also improves on, and provides a versatile alternative to, the well-known analysis pioneered by Green and Ampt in 1911.
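
For context, the classical similarity reduction the abstract builds on is worth writing out (textbook material, not the paper's new lambda(h) solution): substituting the Boltzmann variable turns the horizontal Richards equation, a PDE, into an ODE in a single variable.

```latex
% Classical Boltzmann reduction (textbook form). Horizontal
% infiltration obeys  dtheta/dt = d/dx ( D(theta) dtheta/dx );
% substituting the similarity variable gives
\[
  \lambda(\theta) = x\,t^{-1/2}
  \quad\Longrightarrow\quad
  -\frac{\lambda}{2}\,\frac{d\theta}{d\lambda}
  = \frac{d}{d\lambda}\!\left( D(\theta)\,\frac{d\theta}{d\lambda} \right),
\]
% an ODE in lambda alone, which is why measured profiles theta(x, t)
% at different times collapse onto a single curve theta(lambda).
```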

Abstract:

We model and calibrate the arguments in favor of and against short-term and long-term debt. These arguments broadly include: maturity premium, sustainability, and service smoothing. We use a dynamic-equilibrium model with tax distortions and uncertainty in government outlays, and model maturity as the fraction of debt that needs to be rolled over every period. In the model, the benefits of defaulting are tempered by higher future interest rates. We then calibrate our artificial economy and solve for the optimal debt maturity for Brazil, as an example of a developing country, and the US, as an example of a mature economy. We find that the calibrated costs of defaulting on long-term debt more than offset the costs associated with short-term debt. Therefore, short-term debt implies higher welfare levels.

Abstract:

Purpose - The purpose of this paper is to discuss the economic crisis of 2008/2009 and its major impacts on developing nations and food-producing countries. Within this macro-environment of food chains, there is concern that food inflation might come back sooner than expected. The role of China, as one of the major food consumers of the future, and Brazil, as the major food producer, is described as the "food bridge", and an agenda of common development for these countries is suggested. Design/methodology/approach - This paper reviews the literature on the causes of food inflation and production shortages and investigates programs to solve the problem in the future. It is also based on the author's personal insights and experience of working in this field over the last 15 years, and on recent discussions in forums and interviews. Findings - The major factors that jointly caused the food price increases of 2007/2008 were population growth, income distribution, urbanization, dollar devaluations, commodity funds, social programs, production shortages, and biofuels. A list of ten policies is suggested: horizontal expansion of food production, vertical expansion, reduction in transaction costs, in protectionism and in other taxes, investment in logistics, technology and better coordination, contracts, a new generation of fertilizers, and use of the best sources of biofuels. Originality/value - Two major outputs of this paper are the "food demand model", which brings together in one model the trends and causes of food inflation and the solutions, and the "food bridge" concept, which likewise frames the imminent major food chain cooperation between China and Brazil.

Abstract:

This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that we can solve the model either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these models are based is solved recursively, and it is necessary to simplify some aspects of it to make inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, efficiency implications of environmental tax recycling, endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths, among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path, and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that the energy sector and CO2 price behavior are similar in both versions (in the recursive version of the model we impose the inter-temporal efficiency result that abatement through time should be allocated such that the CO2 price rises at the interest rate). The main difference that arises is that the macroeconomic costs are substantially lower in the forward-looking version of the model, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required for solving the model as an optimization problem, such as dropping the full vintaging of the capital stock and fewer explicit technological options, likely have effects on the results. Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy, where agents face high levels of uncertainty that likely lead to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented. (C) 2009 Elsevier B.V. All rights reserved.
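
The parenthetical efficiency condition is the standard banking-and-borrowing (Hotelling-type) result, sketched here in its textbook form rather than as taken from the paper:

```latex
% With banking and borrowing of allowances, discounted marginal
% abatement costs (equivalently, allowance prices) must be equal
% across periods; otherwise shifting one ton of abatement between
% periods would lower total discounted cost:
\[
  \frac{p_{t+1}}{p_t} = 1 + r
  \qquad\Longleftrightarrow\qquad
  p_t = p_0\,(1+r)^t ,
\]
% where p_t is the CO2 price and r the interest rate.
```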

Abstract:

Electrical impedance tomography is a technique to estimate the impedance distribution within a domain, based on measurements on its boundary. In other words, given the mathematical model of the domain, its geometry and boundary conditions, a nonlinear inverse problem of estimating the electric impedance distribution can be solved. Several impedance estimation algorithms have been proposed to solve this problem. In this paper, we present a three-dimensional algorithm based on the topology optimization method as an alternative. A sequence of linear programming problems, allowing for constraints, is solved using this method. In each iteration, the finite element method provides the electric potential field within the model of the domain. An electrode model is also proposed (thus increasing the accuracy of the finite element results). The algorithm is tested using both numerically simulated and experimental data, and absolute resistivity values are obtained. These results, corresponding to phantoms with two different conductive materials, exhibit relatively well-defined boundaries between the materials, and show that this is a practical and potentially useful technique for monitoring lung aeration, including the possibility of imaging a pneumothorax.
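
One linearized update in an LP-based inverse step can be sketched as follows (illustrative only; the paper's topology-optimization formulation, FEM solver, and electrode model are not reproduced). Given a sensitivity matrix J and a measurement residual r, a bounded conductivity update minimizing the L1 data misfit is found with scipy's linprog.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 16, 8                          # measurements, conductivity parameters
J = rng.standard_normal((m, n))       # stand-in for the FEM sensitivity matrix
r = rng.standard_normal(m)            # stand-in for v_measured - v_model

# minimize sum(s)  s.t.  -s <= J @ dx - r <= s,  |dx| <= 0.1
# decision variables are [dx (n entries), s (m entries)]
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.block([[J, -np.eye(m)], [-J, -np.eye(m)]])
b_ub = np.concatenate([r, -r])
bounds = [(-0.1, 0.1)] * n + [(0, None)] * m
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
dx = res.x[:n]                        # conductivity update for this iteration
print(res.status, dx)
```

Repeating this step, with the FEM re-solved at each new conductivity estimate, is what the "sequence of linear programming problems" in the abstract refers to in spirit.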

Abstract:

Background: There is a multitude of procedures in plastic surgery used to correct hypertrophic and pendulous breasts in patients with heavy and ptotic breasts who need large resections of breast tissue, where the suprasternal notch-to-nipple distance is long and the use of nipple-areola transposition techniques is a challenge for the plastic surgeon. The purpose of this study is to present a technique of reduction mammaplasty that could solve these problems, based on the following principles: mammary reduction utilizing a thin superior medial pedicle (0.8-1.5 cm thick), with the resection performed in two steps: (1) the base excess at a plane perpendicular to the breast (this determines the cone's height) and (2) a central half keel (this determines the breast diameter reduction). Methods: Ninety patients with mammary hypertrophy were operated on at the Hospital das Clinicas, Sao Paulo University Medical School, between January 2000 and November 2005. Inclusion in this study required a minimum 12-cm change in nipple position and a 750-g breast resection. Results: The mean change in nipple position was 16 cm (range = 12-21 cm). The mean weight of each breast was 1400 g (range = 750-3000 g). Considering the large volume removed and the size of the operated breasts, few complications were observed, and these were similar to those reported for other techniques described in the literature. Patient satisfaction following this procedure was high. Conclusion: The results of this study clearly demonstrate that thin superior medial pedicle reduction mammaplasty is a safe and reliable technique in cases of severe mammary hypertrophy.

Abstract:

The histopathological counterpart of white matter hyperintensities is a matter of debate. Methodological and ethical limitations have prevented this question from being elucidated. We introduce a protocol applying state-of-the-art methods to solve fundamental questions regarding the neuroimaging-neuropathological uncertainties surrounding the most common white matter hyperintensities (WMHs) seen in aging. With this protocol, the correlation between signal features obtained in situ and by post mortem MRI-derived methods, including DTI and MTR, and quantitative and qualitative histopathology can be investigated. We are mainly interested in determining the precise neuroanatomical substrate of incipient WMHs. A major issue in this protocol is the exact co-registration of small lesions in a three-dimensional coordinate system that compensates for tissue deformations after histological processing. The protocol is based on four principles: post mortem in situ MRI performed within a short post mortem interval, minimal brain deformation during processing, thick serial histological sections, and computer-assisted 3D reconstruction of the histological sections. This protocol will greatly facilitate a systematic study of the location, pathogenesis, clinical impact, prognosis and prevention of WMHs. (C) 2009 Elsevier B.V. All rights reserved.

Abstract:

There is a positive correlation between the intensity of use of a given antibiotic and the prevalence of resistant strains. The more you treat, the more patients infected with resistant strains appear and, as a consequence, the higher the mortality due to the infection and the longer the hospitalization time. In contrast, the less you treat, the higher the mortality rates and the longer the hospitalization times of patients infected with sensitive strains that could have been successfully treated. The hypothesis proposed in this paper is an attempt to resolve this conflict: there must be an optimum treatment intensity that minimizes both the additional mortality and the hospitalization time due to infection by both sensitive and resistant bacterial strains. To test this hypothesis we applied a simple mathematical model that allowed us to estimate the optimum proportion of patients to be treated in order to minimize the total number of deaths and the hospitalization time due to the infection in a hospital setting. (C) 2007 Elsevier Inc. All rights reserved.
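
A minimal numerical illustration of the hypothesis (not the paper's model; every functional form and coefficient below is an assumption): if resistance prevalence rises with the treated proportion p while deaths among untreated sensitive-strain patients fall with p, total mortality has an interior minimum.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def total_deaths(p, d_treated=0.05, d_untreated=0.20, d_res=0.30, k=0.6):
    res_prev = k * p                   # resistance prevalence grows with use
    sens_prev = 1.0 - res_prev
    # sensitive-strain deaths: treated patients fare better than untreated
    sens_deaths = sens_prev * (p * d_treated + (1 - p) * d_untreated)
    res_deaths = res_prev * d_res      # treatment fails against resistant strains
    return sens_deaths + res_deaths

opt = minimize_scalar(total_deaths, bounds=(0.0, 1.0), method="bounded")
print(f"optimal treated proportion: {opt.x:.2f}")
```

With these toy numbers the total is quadratic in p and the optimum sits at p = 0.5, i.e., strictly between "treat no one" and "treat everyone", which is exactly the trade-off the abstract describes.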