31 results for Solving problems
Abstract:
We consider a class of two-dimensional problems in classical linear elasticity for which material overlapping occurs in the absence of singularities. Of course, material overlapping is not physically realistic, and one possible way to prevent it uses a constrained minimization theory. In this theory, a minimization problem consists of minimizing the total potential energy of a linear elastic body subject to the constraint that the deformation field must be locally invertible. Here, we use an interior and an exterior penalty formulation of the minimization problem together with both a standard finite element method and classical nonlinear programming techniques to compute the minimizers. We compare both formulations by solving a plane problem numerically in the context of the constrained minimization theory. The problem has a closed-form solution, which is used to validate the numerical results. This solution is regular everywhere, including the boundary. In particular, we show numerical results which indicate that, for a fixed finite element mesh, the sequences of numerical solutions obtained with both the interior and the exterior penalty formulations converge to the same limit function as the penalization is enforced. This limit function yields an approximate deformation field to the plane problem that is locally invertible at all points in the domain. As the mesh is refined, this field converges to the exact solution of the plane problem.
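The penalty idea described above can be illustrated with a small, self-contained sketch. The "energy", constraint and parameter values below are invented toy stand-ins (a quadratic objective and a single linear inequality playing the role of local invertibility), not the paper's finite element formulation; the sketch only shows how an exterior penalty and an interior (barrier) penalty drive a sequence of unconstrained minimizations toward the same constrained limit as the penalization is enforced.

```python
# Toy illustration of the exterior vs. interior penalty idea (NOT the paper's
# elasticity formulation): minimize a simple "energy" E(x) subject to the
# hypothetical constraint g(x) > 0, standing in for local invertibility.
import numpy as np
from scipy.optimize import minimize

def energy(x):
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2   # toy potential energy

def g(x):
    return 1.0 - x[0]            # constraint g(x) > 0 (toy analogue of det F > 0)

def exterior(x, mu):
    # quadratic penalty activated only where the constraint is violated
    return energy(x) + mu * min(g(x), 0.0) ** 2

def interior(x, mu):
    # log-barrier: blows up as the iterate approaches the constraint boundary
    if g(x) <= 0.0:
        return np.inf
    return energy(x) - (1.0 / mu) * np.log(g(x))

x_ext = x_int = np.array([0.0, 0.0])
for mu in [1.0, 10.0, 100.0, 1000.0]:          # enforce the penalization progressively
    x_ext = minimize(exterior, x_ext, args=(mu,), method="Nelder-Mead").x
    x_int = minimize(interior, x_int, args=(mu,), method="Nelder-Mead").x
print(x_ext, x_int)   # both sequences approach the same constrained minimizer (1, -1)
```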
Abstract:
This work presents an analysis of the wavelet-Galerkin method for one-dimensional elastoplastic-damage problems. A time-stepping algorithm for non-linear dynamics is presented. Numerical treatment of the constitutive models is developed through the use of a return-mapping algorithm. For spatial discretization, the wavelet-Galerkin method can be used instead of the standard finite element method. This approach makes it possible to locate singularities. The discrete formulation developed can be applied to the simulation of one-dimensional problems for elastic-plastic-damage models. (C) 2007 Elsevier Inc. All rights reserved.
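As a point of reference, a return-mapping update for the simplest one-dimensional case (linear isotropic hardening, no damage coupling) looks roughly as follows; the material parameters and variable names are illustrative assumptions, not taken from the paper.

```python
# Minimal 1D return-mapping update for linear isotropic hardening
# (elastic predictor / plastic corrector). Parameters are illustrative only;
# the damage coupling described in the abstract is not included.
def return_map(eps, eps_p, alpha, E=200e3, H=10e3, sigma_y=250.0):
    """One strain-driven constitutive update.
    eps: total strain, eps_p: plastic strain, alpha: accumulated plastic strain."""
    sigma_trial = E * (eps - eps_p)                 # elastic predictor
    f_trial = abs(sigma_trial) - (sigma_y + H * alpha)
    if f_trial <= 0.0:                              # elastic step
        return sigma_trial, eps_p, alpha
    dgamma = f_trial / (E + H)                      # plastic corrector (closed form in 1D)
    sign = 1.0 if sigma_trial >= 0.0 else -1.0
    sigma = sigma_trial - E * dgamma * sign
    return sigma, eps_p + dgamma * sign, alpha + dgamma

# drive the model through a simple strain history
eps_p = alpha = 0.0
for eps in [0.0005, 0.0010, 0.0020, 0.0030]:
    sigma, eps_p, alpha = return_map(eps, eps_p, alpha)
    print(f"eps={eps:.4f}  sigma={sigma:7.1f}  alpha={alpha:.5f}")
```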
Abstract:
This paper presents a new methodology to estimate harmonic distortions in a power system, based on measurements at a limited number of given sites. The algorithm utilizes evolutionary strategies (ES), a development branch of evolutionary algorithms. The main advantage of using such a technique lies in its modeling facilities as well as its potential to solve fairly complex problems. The problem-solving algorithm herein proposed makes use of data from various power-quality (PQ) meters, which can be synchronized either by high-technology global positioning system devices or by using information from a fundamental-frequency load flow. This second approach makes the overall PQ monitoring system much less costly. The algorithm is applied to an IEEE test network, for which sensitivity analysis is performed to determine how the parameters of the ES can be selected so that the algorithm performs in an effective way. Case studies show fairly promising results and the robustness of the proposed method.
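For readers unfamiliar with evolution strategies, a bare-bones (mu + lambda) loop is sketched below on a stand-in least-squares objective; in the paper the objective would measure the mismatch between estimated and measured harmonic quantities at the monitored buses, and all names and settings here are hypothetical.

```python
# Bare-bones (mu + lambda) evolution strategy minimizing a stand-in objective.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    return np.sum((x - np.array([0.3, -1.2, 0.8])) ** 2)   # hypothetical mismatch

mu, lam, sigma, dim = 5, 20, 0.5, 3
parents = rng.normal(size=(mu, dim))
for gen in range(200):
    # each offspring mutates a randomly chosen parent (no recombination, for brevity)
    idx = rng.integers(0, mu, size=lam)
    offspring = parents[idx] + sigma * rng.normal(size=(lam, dim))
    pool = np.vstack([parents, offspring])                  # (mu + lambda) selection
    pool = pool[np.argsort([objective(p) for p in pool])]
    parents = pool[:mu]
    sigma *= 0.98                                           # simple deterministic step-size decay
print(parents[0], objective(parents[0]))
```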
Abstract:
In the present paper the dynamic solutions of two non-steady seepage problems are discussed. It is shown that the acceleration term in the equation of motion is important for a correct qualitative description of the flow.
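For context, one commonly used form of the non-steady seepage equation of motion that retains the fluid acceleration term is sketched below; this is an assumed generic form, not necessarily the one adopted by the authors:

$$\frac{\rho}{n}\,\frac{\partial \mathbf{q}}{\partial t} + \frac{\mu}{k}\,\mathbf{q} = -\nabla p + \rho\,\mathbf{g},$$

where $\mathbf{q}$ is the specific discharge, $n$ the porosity, $k$ the intrinsic permeability, $\mu$ the dynamic viscosity and $p$ the pore pressure; dropping the first (acceleration) term recovers the quasi-steady Darcy law.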
Abstract:
This article presents a tool for the allocation analysis of complex water resource systems, called AcquaNetXL, developed in the form of a spreadsheet into which a linear optimization model and a nonlinear one were incorporated. AcquaNetXL keeps the concepts and attributes of a decision support system. In other words, it streamlines communication between the user and the computer, facilitates the understanding and formulation of the problem and the interpretation of the results, and supports the decision-making process, turning it into a clear and organized one. The performance of the algorithms used for solving the water allocation problems was satisfactory, especially for the linear model.
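A minimal sketch of the class of linear allocation model such a tool solves is shown below; the network, benefits and bounds are entirely hypothetical and only illustrate a water allocation LP (maximize weighted deliveries subject to mass balance and demand bounds).

```python
# Tiny allocation LP solved with scipy. Arcs, benefits and bounds are
# hypothetical placeholders for the class of linear models in the abstract.
import numpy as np
from scipy.optimize import linprog

# decision variables: flow to the city, flow to irrigation, spill
c = np.array([-2.0, -1.0, 0.0])           # negative = benefit per unit delivered (linprog minimizes)
A_eq = np.array([[1.0, 1.0, 1.0]])        # mass balance at the reservoir
b_eq = np.array([100.0])                  # 100 units of water available
bounds = [(30.0, 60.0),                   # city demand: between 30 and 60 units
          (20.0, 80.0),                   # irrigation: between 20 and 80 units
          (0.0, None)]                    # spill is unbounded above

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, -res.fun)                    # optimal allocation and total benefit
```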
Abstract:
Tailoring specified vibration modes is a requirement for designing piezoelectric devices aimed at dynamic-type applications. A technique for designing the shape of specified vibration modes is the topology optimization method (TOM), which finds an optimum material distribution inside a design domain to obtain a structure that vibrates according to specified eigenfrequencies and eigenmodes. Nevertheless, when the TOM is applied to dynamic problems, the well-known grayscale or intermediate material problem arises, which can invalidate the post-processing of the optimal result. Thus, a more natural way of solving dynamic problems using the TOM is to allow intermediate material values. This idea leads to the functionally graded material (FGM) concept. In fact, FGMs are materials whose properties and microstructure change continuously along a specific direction. Therefore, in this paper, an approach is presented for tailoring user-defined vibration modes by applying the TOM and FGM concepts to design functionally graded piezoelectric transducers (FGPT) and non-piezoelectric structures (functionally graded structures, FGS) in order to achieve maximum and/or minimum vibration amplitudes at certain points of the structure, by simultaneously finding the topology and the material gradation function. The optimization problem is solved by using sequential linear programming. Two-dimensional results are presented to illustrate the method.
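The sequential linear programming step mentioned above can be sketched generically: linearize the objective at the current design, impose move limits as a box, and solve the resulting LP. The objective below is a smooth toy function rather than an eigenfrequency/eigenmode functional, and all names, bounds and tolerances are assumptions.

```python
# Generic sequential linear programming (SLP) loop with move limits.
# The objective is a stand-in; in topology optimization the gradient would
# come from sensitivities of the finite element eigen-analysis.
import numpy as np
from scipy.optimize import linprog, approx_fprime

def f(x):                                   # stand-in smooth objective
    return (x[0] - 0.7) ** 2 + (x[1] - 0.2) ** 2 + 0.5 * x[0] * x[1]

x = np.array([0.5, 0.5])                    # design variables in [0, 1] (like densities)
move = 0.2                                  # move limit per SLP iteration
for it in range(30):
    grad = approx_fprime(x, f, 1e-6)        # linearize the objective at x
    lo = np.maximum(0.0, x - move)          # box from move limits and side constraints
    hi = np.minimum(1.0, x + move)
    res = linprog(grad, bounds=list(zip(lo, hi)))   # solve the linearized subproblem
    if np.linalg.norm(res.x - x) < 1e-6:
        break
    x = res.x
    move *= 0.9                             # shrink move limits to promote convergence
print(x, f(x))
```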
Abstract:
Hub-and-spoke networks are widely studied in the area of location theory. They arise in several contexts, including passenger airlines, postal and parcel delivery, and computer and telecommunication networks. Hub location problems usually involve three simultaneous decisions to be made: the optimal number of hub nodes, their locations and the allocation of the non-hub nodes to the hubs. In the uncapacitated single allocation hub location problem (USAHLP) hub nodes have no capacity constraints and non-hub nodes must be assigned to only one hub. In this paper, we propose three variants of a simple and efficient multi-start tabu search heuristic as well as a two-stage integrated tabu search heuristic to solve this problem. With multi-start heuristics, several different initial solutions are constructed and then improved by tabu search, while in the two-stage integrated heuristic tabu search is applied to improve both the locational and allocational part of the problem. Computational experiments using typical benchmark problems (Civil Aeronautics Board (CAB) and Australian Post (AP) data sets) as well as new and modified instances show that our approaches consistently return the optimal or best-known results in very short CPU times, thus allowing the possibility of efficiently solving larger instances of the USAHLP than those found in the literature. We also report the integer optimal solutions for all 80 CAB data set instances and the 12 AP instances up to 100 nodes, as well as for the corresponding new generated AP instances with reduced fixed costs. Published by Elsevier Ltd.
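A skeletal multi-start tabu search for a deliberately simplified hub location variant is sketched below (fixed hub opening costs plus nearest-open-hub assignment; the inter-hub transfer costs, discount factor and CAB/AP cost structure of the USAHLP are omitted, and all data are random placeholders).

```python
# Skeletal multi-start tabu search for a *simplified* hub location problem:
# choose which nodes become hubs (paying a fixed cost) and assign every other
# node to its nearest open hub.
import numpy as np

rng = np.random.default_rng(1)
n = 12
xy = rng.random((n, 2))
dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
fixed = rng.uniform(0.5, 1.0, size=n)             # hub opening costs

def cost(hubs):                                   # hubs: boolean array of open hubs
    if not hubs.any():
        return np.inf
    assign = dist[:, hubs].min(axis=1)            # nearest-open-hub allocation
    return fixed[hubs].sum() + assign.sum()

def tabu_search(start, iters=200, tenure=7):
    best = cur = start.copy()
    tabu = {}
    for it in range(iters):
        moves = [i for i in range(n) if tabu.get(i, -1) < it]
        # flip the single hub/non-hub status that most improves the cost
        cand = min(moves, key=lambda i: cost(np.logical_xor(cur, np.eye(n, dtype=bool)[i])))
        cur = np.logical_xor(cur, np.eye(n, dtype=bool)[cand])
        tabu[cand] = it + tenure                  # forbid re-flipping for `tenure` iterations
        if cost(cur) < cost(best):
            best = cur.copy()
    return best

# multi-start: several random initial hub sets, keep the overall best
starts = [rng.random(n) < 0.3 for _ in range(5)]
best = min((tabu_search(s) for s in starts), key=cost)
print(np.flatnonzero(best), cost(best))
```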
Abstract:
Objective To evaluate the efficiency of pharmaceutical care on the control of clinical parameters, such as fasting glycaemia and glycosylated haemoglobin, in patients with Type 2 Diabetes mellitus. Setting This study was conducted at the Training and Community Health Centre of the College of Medicine of Ribeirao Preto, University of Sao Paulo, Brazil. Methods A prospective and experimental study was conducted with 71 participants divided into two groups: (i) the pharmaceutical care group (n=40), and (ii) the control group (n=31). Patients were distributed between these groups at random and were monitored for 12 months. Main outcome measure Values for fasting glycaemia and glycosylated haemoglobin were collected. Results Mean values of fasting glycaemia in the pharmaceutical care group were significantly reduced, whilst only a small reduction was detected in the control group over the same period. A significant reduction in the levels of glycosylated haemoglobin was detected in patients in the pharmaceutical care group, and an average increase was observed in the control group. Furthermore, the follow-up of the intervention group by a pharmacist contributed to the resolution of 62.7% of the 142 drug therapy problems identified. Conclusion In Brazil, the information provided by a pharmacist to patients with Type 2 Diabetes mellitus increases compliance with treatment, solving or reducing drug therapy problems and, consequently, improving glycaemic control.
Abstract:
This article intends to rationally reconstruct Locke's theory of knowledge as incorporated in a research program concerning the nature and structure of the theories and models of rationality. In previous articles we argued that the rationalist program can be subdivided into the classical rationalistic subprogram, which includes the knowledge theories of Descartes, Locke, Hume and Kant, the neoclassical subprogram, which includes the approaches of Duhem, Poincare and Mach, and the critical subprogram of Popper. The subdivision results from the different views of rationality proposed by each one of these subprograms, as well as from the tools made available by each one of them, containing theoretical instruments used to arrange, organize and develop the discussion on rationality, the main one being the structure of problem solving. In this essay we intend to reconstruct the assumptions of Locke's theory of knowledge, which in our view belongs to the classical rationalistic subprogram because it shares with it the thesis of the identity of (scientific) knowledge and certain knowledge.
Abstract:
Managing financial institutions in an underdeveloped economic context has become a real challenge nowadays. In order to reach their planned goals, these organizations have to deal with structural, behavioral and informational problems. From the systemic point of view, this situation gets even worse when the company lacks organizational boundaries and a cohesive identity for its stakeholders. Thus, European countries offer special financial lines to support the development of micro credit in Latin communities in an attempt to help the local economy. However, institutions like Caixa dos Andes in Peru present management problems when dealing with this complexity. Based on this, how can a systemic view help in the diagnosis of the soft problems of a Peruvian financial company? This study aims to diagnose the soft problems of a Peruvian financial company based on soft variables such as identity, communication and autonomy, and also intends to identify possible ways to redesign its basic framework. The Viable System Model (VSM) method of Beer (1967), applied in this diagnostic study, was used in a practical way as a management tool for the analysis and planning of organizations. By describing the VSM's five systems, the creation of a systemic or total vision is possible, showing the organization's complexity from the inside. Some of the company's soft problems were identified, such as double control, inefficient use of physical and human resources, low information flows and slowness. The VSM provided an organizational diagnosis indicating effective solutions that integrate its five systems.
Abstract:
This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that we can solve the model either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these models are based is solved recursively, and it is necessary to simplify some aspects of it to make inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, efficiency implications of environmental tax recycling, endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path, and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that the energy sector and CO2 price behavior are similar in both versions (in the recursive version of the model we force the inter-temporal theoretical efficiency result that abatement through time should be allocated such that the CO2 price rises at the interest rate). The main difference that arises is that the macroeconomic costs are substantially lower in the forward-looking version of the model, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required for solving the model as an optimization problem, such as dropping the full vintaging of the capital stock and fewer explicit technological options, likely have effects on the results. Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy where agents face high levels of uncertainty that likely lead to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented. (C) 2009 Elsevier B.V. All rights reserved.
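The intertemporal efficiency condition referred to above can be stated, assuming discrete periods and a constant interest rate $r$, as

$$p_{t+1}^{\mathrm{CO_2}} = (1+r)\,p_t^{\mathrm{CO_2}} \quad\Longleftrightarrow\quad p_t^{\mathrm{CO_2}} = (1+r)^{t}\,p_0^{\mathrm{CO_2}},$$

i.e., the CO2 price grows at the interest rate, so that discounted marginal abatement costs are equalized across periods.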
Abstract:
Background: Studies investigating the association between alcohol use and cognitive disorders in the elderly population have produced divergent results. Moreover, the role of alcohol in cognitive dysfunction is not clear. The aims of this study were to estimate the prevalence of alcohol-related problems in an elderly population from Brazil and to investigate their association with cognitive and functional impairment (CFI) and dementia. Methods: A community-based cross-sectional study was performed. A sample of 1,145 elderly people was examined in 2 phases. Several instruments were utilized in the first phase: the CAGE questionnaire was used to identify potential cases of alcohol-related problems, and a screening test for dementia was used to estimate CFI. The CAMDEX interview (Cambridge Examination) and DSM-IV (Diagnostic and Statistical Manual of Mental Disorders, 4th edition) criteria were used for the clinical diagnosis of dementia in the second phase. Results: "Heavy alcohol use" (CAGE >= 2) was found in 92 subjects (prevalence: 8.2%). It was associated with gender (males, p < 0.001), low education (only in females, p = 0.002), and low socioeconomic level (p = 0.001, in females; p = 0.002, in males). The Mini Mental State Examination exhibited a nonlinear relationship with alcohol-related problems in females; "mild-moderate alcohol use" (CAGE < 2) presented the highest score. A significant association between alcohol-related problems and cognitive dysfunction was found only in females. "Heavy alcohol use" was associated with higher CFI and dementia rates compared to "mild-moderate alcohol use" (p = 0.003 and p < 0.001, respectively). "Mild-moderate alcohol use" had a tendency of association with lower CFI and dementia rates when compared to "no alcohol use" (p = 0.063 and 0.050, respectively). Conclusion: Our findings suggest that alcohol use does not have a linear relationship with cognitive decline.
Abstract:
Objective: To evaluate the usefulness of gamma-glutamyltransferase (GGT) and mean corpuscular volume (MCV), as well as that of the CAGE questionnaire, in workplace screening for alcohol abuse/dependence. Methods: A total of 183 male employees were submitted to structured interviews (Structured Clinical Interview for DSM-IV 2.0 and CAGE questionnaire). Blood samples were collected. Diagnostic accuracy and odds ratios were determined for the CAGE, GGT and MCV. Results: The CAGE questionnaire presented the best sensitivity for alcohol dependence (91%; specificity, 87.8%) and for alcohol abuse (87.5%; specificity, 80.9%), which increased when the questionnaire was used in combination with GGT (sensitivity, 100% and 87.5%, respectively; specificity, 68% and 61.5%, respectively). Positive CAGE results and/or alterations in GGT were less likely to occur among employees not presenting alcohol abuse/dependence than among those presenting such abuse (OR for CAGE = 13, p < 0.05; OR for CAGE-GGT = 11, p < 0.05) or dependence (OR for CAGE = 76, p < 0.01; OR for GGT = 5, p < 0.01). Employees not presenting alcohol abuse/dependence were also several times more likely to present negative CAGE or GGT results. Conclusions: The use of short, simple questionnaires, combined with that of low-cost biochemical markers, such as GGT, can serve as an initial screening for alcohol-related problems, especially for employees in hazardous occupations. The data provided can serve to corroborate clinical findings. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
In this study, the effectiveness of a group-based attention and problem solving (APS) treatment approach to executive impairments in patients with frontal lobe lesions was investigated. Thirty participants with lesions in the frontal lobes, 16 with left frontal (LF) and 14 with right frontal (RF) lesions, were allocated into three groups, each with 10 participants. The APS treatment was initially compared to two other control conditions, an information/education (IE) approach and treatment-as-usual or traditional rehabilitation (TR), with each of the control groups subsequently receiving the APS intervention in a crossover design. This design allowed for an evaluation of the treatment through assessment before and after treatment and at follow-up six months later. There was an improvement on some executive and functional measures after the implementation of the APS programme in the three groups. Size, and to a lesser extent laterality, of lesion affected baseline performance on measures of executive function, but there was no apparent relationship between the size, laterality or site of the lesion and the level of benefit from the treatment intervention. The results are discussed in terms of models of executive functioning and the effectiveness of domain-specific interventions in the rehabilitation of executive dysfunction.
Abstract:
Immunological systems have been an abundant source of inspiration for contemporary computer scientists. Problem-solving strategies stemming from known immune system phenomena have been successfully applied to challenging problems of modern computing. Simulation systems and mathematical modeling are also beginning to be used to answer more complex immunological questions, such as the immune memory process and the duration of vaccines, where the regulation mechanisms are still not sufficiently known (Lundegaard, Lund, Kesmir, Brunak, Nielsen, 2007). In this article we studied an in machina approach to simulating the process of antigenic mutation and its implications for the memory process. Our results suggest that the durability of immune memory is affected by the process of antigenic mutation and by the populations of soluble antibodies in the blood. The results also strongly suggest that a decrease in the production of antibodies favors the global maintenance of immune memory.
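A very small bit-string sketch in the spirit of such simulations is given below; the representation, affinity measure and update rules are invented for illustration and are not the authors' model.

```python
# Toy bit-string sketch of antigenic mutation vs. immune memory. All rules and
# thresholds here are hypothetical illustrations, not the authors' simulation.
import numpy as np

rng = np.random.default_rng(2)
L = 32                                       # epitope length in bits
antigen = rng.integers(0, 2, L)
memory = antigen.copy()                      # memory repertoire: one matched clone

def affinity(ab, ag):
    return np.mean(ab == ag)                 # fraction of matching bits

for exposure in range(10):
    flips = rng.integers(0, L, size=2)       # the antigen drifts: flip a few random bits
    antigen[flips] ^= 1
    a = affinity(memory, antigen)
    if a < 0.8:                              # drifted too far: memory no longer protects
        memory = antigen.copy()              # a naive response rebuilds memory from scratch
    print(f"exposure {exposure}: affinity of memory to current antigen = {a:.2f}")
```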