983 results for Methods: numerical
Abstract:
Final Master's project carried out at the Laboratório Nacional de Engenharia Civil (LNEC) to obtain the degree of Master in Civil Engineering from the Instituto Superior de Engenharia de Lisboa, under the Cooperation Protocol between ISEL and LNEC.
Abstract:
The influence of uncertainties in input parameters on the output response of composite structures is investigated in this paper. In particular, the effects of deviations in mechanical properties, ply angles, ply thicknesses and applied loads are studied. The uncertainty propagation and the importance measures of input parameters are analysed using three different approaches: a first-order local method, a Global Sensitivity Analysis (GSA) supported by a variance-based method, and an extension of the local variance to estimate the global variance over the domain of the inputs. Sample results are shown for a laminated composite shell structure built with different composite systems, including multi-materials. The importance measures of the input parameters on the structural response, based on numerical results, are established and discussed as a function of the anisotropy of the composite materials. The need for global variance methods is discussed by comparing the results obtained from the different proposed methodologies. The objective of this paper is to contribute to the use of GSA techniques together with inexpensive local importance measures.
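The first-order local measure described above can be sketched as follows; the model, parameter names and standard deviations are illustrative stand-ins, not values from the paper.

```python
import numpy as np

def local_importance(model, x0, sigmas, h=1e-6):
    """First-order local measure: each input's contribution to the output
    variance, var_i ~ (dy/dx_i)^2 * sigma_i^2, via central finite differences."""
    x0 = np.asarray(x0, dtype=float)
    grads = np.empty_like(x0)
    for i in range(len(x0)):
        e = np.zeros_like(x0)
        e[i] = h
        grads[i] = (model(x0 + e) - model(x0 - e)) / (2 * h)
    contrib = grads**2 * np.asarray(sigmas, dtype=float)**2
    return contrib / contrib.sum()  # normalised first-order importance shares

# toy "structural response" y = 3*E + 0.5*t; E and t stand in for a stiffness
# and a ply thickness (names and model are illustrative, not from the paper)
shares = local_importance(lambda x: 3 * x[0] + 0.5 * x[1], [1.0, 1.0], [0.1, 0.2])
```

For this linear toy model the shares are exact; for a nonlinear response they only hold near the nominal point, which is precisely the limitation that motivates the global (variance-based) methods the paper compares against.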
Abstract:
OBJECTIVE To analyze temporal trends and distribution patterns of unsafe abortion in Brazil. METHODS Ecological study based on records of hospital admissions of women due to abortion in Brazil between 1996 and 2012, obtained from the Hospital Information System of the Ministry of Health. We estimated the number of unsafe abortions stratified by place of residence, using indirect estimate techniques. The following indicators were calculated: ratio of unsafe abortions/100 live births and rate of unsafe abortion/1,000 women of childbearing age. We analyzed temporal trends through polynomial regression and spatial distribution using municipalities as the unit of analysis. RESULTS In the study period, a total of 4,007,327 hospital admissions due to abortions were recorded in Brazil. We estimated a total of 16,905,911 unsafe abortions in the country, with an annual mean of 994,465 abortions (mean unsafe abortion rate: 17.0 abortions/1,000 women of childbearing age; ratio of unsafe abortions: 33.2/100 live births). Unsafe abortion presented a declining trend at national level (R2: 94.0%, p < 0.001), with unequal patterns between regions. There was a significant reduction of unsafe abortion in the Northeast (R2: 93.0%, p < 0.001), Southeast (R2: 92.0%, p < 0.001) and Central-West regions (R2: 64.0%, p < 0.001), whereas the North (R2: 39.0%, p = 0.030) presented an increase, and the South (R2: 22.0%, p = 0.340) remained stable. Spatial analysis identified the presence of clusters of municipalities with high values for unsafe abortion, located mainly in states of the North, Northeast and Southeast Regions. CONCLUSIONS Unsafe abortion remains a public health problem in Brazil, with marked regional differences, mainly concentrated in the socioeconomically disadvantaged regions of the country. 
Improving the quality of care for women's health, especially regarding reproductive health and pre- and post-abortion care, is a necessary and urgent strategy to be implemented in the country.
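The trend analysis named in METHODS (polynomial regression with an R² of fit) can be sketched as below, on a synthetic declining series rather than the study's data.

```python
import numpy as np

def poly_trend(years, rates, degree=1):
    """Fit a polynomial trend to an annual series and report the R^2 of the
    fit, as in temporal trend analyses of rates (data here are synthetic)."""
    years = np.asarray(years, dtype=float)
    rates = np.asarray(rates, dtype=float)
    x = years - years[0]                      # centre years to condition the fit
    coefs = np.polyfit(x, rates, degree)
    fitted = np.polyval(coefs, x)
    ss_res = np.sum((rates - fitted) ** 2)
    ss_tot = np.sum((rates - rates.mean()) ** 2)
    return coefs, 1.0 - ss_res / ss_tot

# synthetic series: a rate of 17.0 falling by 0.3 per year over 1996-2012
years = np.arange(1996, 2013)
coefs, r2 = poly_trend(years, 17.0 - 0.3 * (years - 1996))
```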
Abstract:
Optimization problems arise in science, engineering, economics, etc., and we need to find the best solution for each situation. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the algorithms available for solving them and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the involved functions are nonlinear and their derivatives are unknown or very difficult to calculate, suitable methods are rarer. Functions of this kind are frequently called black-box functions. To solve such problems without constraints (unconstrained optimization), we can use direct search methods, which do not require any derivatives or approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black-box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used: they transform the original problem into a sequence of other problems, derived from the initial one, all without constraints. This sequence of unconstrained problems can then be solved using the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow solving optimization problems with continuous, discrete and mixed constraints, without requiring continuity, differentiability or convexity. Thus, penalty methods can be used as the first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjusts the penalty parameter dynamically.
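The penalty idea described above can be sketched with a classic quadratic penalty and a derivative-free inner solver; the objective, constraint and parameter values are illustrative, not the chapter's actual algorithms.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):                        # "black-box" objective (illustrative)
    return x[0]**2 + x[1]**2

def g(x):                        # equality constraint g(x) = 0 (illustrative)
    return x[0] + x[1] - 1.0

def quadratic_penalty(f, g, x0, mu0=1.0, growth=10.0, outer_iters=6):
    """Replace min f(x) s.t. g(x)=0 by a sequence of unconstrained problems
    min f(x) + mu*g(x)^2 with growing mu, each solved derivative-free."""
    x, mu = np.asarray(x0, dtype=float), mu0
    for _ in range(outer_iters):
        res = minimize(lambda z: f(z) + mu * g(z)**2, x, method="Nelder-Mead")
        x, mu = res.x, mu * growth   # warm-start next subproblem, grow penalty
    return x

x_opt = quadratic_penalty(f, g, x0=[0.0, 0.0])  # exact solution is (0.5, 0.5)
```

Each outer iteration only tightens the constraint: for this problem the subproblem minimiser is mu/(1+2*mu) in each coordinate, which approaches 0.5 as mu grows, illustrating why the sequence (rather than a single fixed penalty) is needed.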
Abstract:
Penalty and barrier methods are normally used to solve constrained Nonlinear Optimization Problems. These problems appear in areas such as engineering and are often characterised by the fact that the involved functions (objective and constraints) are non-smooth and/or their derivatives are not known. This means that optimization methods based on derivatives cannot be used. A Java-based API was implemented, including only derivative-free optimization methods, to solve both constrained and unconstrained problems; it includes penalty and barrier methods. In this work a new penalty function, based on Fuzzy Logic, is presented. This function imposes a progressive penalization on solutions that violate the constraints: a light penalization when the violation of the constraints is low and a heavy penalization when the violation is high. The value of the penalization is not known beforehand; it is the outcome of a fuzzy inference engine. Numerical results comparing the proposed function with two of the classic penalty/barrier functions are presented. From these results one can conclude that the proposed penalty function, besides being very robust, also exhibits very good performance.
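A minimal sketch of a progressive, fuzzy-style penalization follows. It uses a single membership function where the paper uses a full fuzzy inference engine, so it illustrates the idea only; all thresholds and weights are hypothetical.

```python
def fuzzy_penalty(violation, low=0.1, high=1.0, p_low=1.0, p_high=1e3):
    """Progressive penalty driven by a fuzzy-style membership of the constraint
    violation in the set "high": light penalization for small violations,
    heavy penalization for large ones. (Sketch only; the paper derives the
    penalty from a fuzzy inference engine, not this single membership.)"""
    if violation <= 0.0:
        return 0.0                                   # feasible point: no penalty
    # linear membership of the violation in the fuzzy set "high", clipped to [0,1]
    mu = min(1.0, max(0.0, (violation - low) / (high - low)))
    return ((1.0 - mu) * p_low + mu * p_high) * violation

# small violation -> near-linear cost; large violation -> ~1000x heavier
costs = [fuzzy_penalty(v) for v in (0.0, 0.05, 2.0)]
```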
Abstract:
The characteristics of carbon fibre reinforced laminates have widened their use, from aerospace to domestic appliances. A common requirement is the need for drilling for assembly purposes. It is known that a drilling process that reduces the drill thrust force can decrease the risk of delamination. In this work, delamination assessment methods based on radiographic data are compared and correlated with mechanical test results (bearing test).
Abstract:
Constrained and unconstrained Nonlinear Optimization Problems often appear in many engineering areas. In some of these cases it is not possible to use derivative-based optimization methods, because the objective function is not known, is too complex, or is non-smooth. In these cases, Direct Search Methods may be the most suitable optimization methods. An Application Programming Interface (API) including some of these methods was implemented using Java technology. This API can be accessed either by applications running on the same computer where it is installed, or remotely through a LAN or the Internet, using web services. From the engineering point of view, the information needed from the API is the solution to the provided problem. From the point of view of researchers in optimization methods, however, not only the solution is needed: additional information about the iterative process is also useful, such as the number of iterations, the value of the solution at each iteration, and the stopping criteria. This paper presents the features added to the API that allow users to access the iterative process data.
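A direct search method of the kind the API implements can be sketched as a compass (coordinate) search; this is a generic textbook variant, not the API's actual Java code.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iters=100000):
    """Derivative-free direct search: poll +/- each coordinate direction,
    accept the first improving point, otherwise halve the step length."""
    x, fx = list(x0), f(x0)
    while step > tol and max_iters > 0:
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = x.copy()
                y[i] += s
                fy = f(y)
                if fy < fx:                  # only function values are compared
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5      # refine the mesh around the current best point
        max_iters -= 1
    return x, fx

# minimize (x-1)^2 + (y+2)^2 without any derivative information
x_best, f_best = compass_search(lambda v: (v[0] - 1.0)**2 + (v[1] + 2.0)**2,
                                [0.0, 0.0])
```

The iterate history and final step size are exactly the kind of iterative-process data the paper exposes through the API.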
Abstract:
In nonlinear optimization, penalty and barrier methods are normally used to solve constrained problems. There are several penalty/barrier methods and they are used in many areas, from engineering to economics, through biology, chemistry and physics, among others. In these areas, optimization problems often arise in which the involved functions (objective and constraints) are non-smooth and/or their derivatives are not known. In this work some penalty/barrier functions are tested and compared, using derivative-free methods, namely direct search methods, in the internal process. This work is part of a larger project involving the development of an Application Programming Interface that implements several optimization methods, to be used in applications that need to solve constrained and/or unconstrained Nonlinear Optimization Problems. Besides its use in applied mathematics research, it is also intended for use in engineering software packages.
Abstract:
In recent decades, biocomposites have been widely used in the construction, automobile and aerospace industries. Not only the interface transition zone (ITZ) but also the heterogeneity of natural fibres affects the mechanical behaviour of these composites. This work focuses on the numerical and experimental analyses of a polymeric composite fabricated with epoxy resin and unidirectional sisal and banana fibres. A three-dimensional model was set up to analyse the composites using the elastic properties of the individual phases. In addition, a two-dimensional model was set up taking into account the effective composite properties obtained by micromechanical models. Tensile testing was performed to validate the numerical analyses and to evaluate the interface condition of the constituent phases.
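The micromechanical models mentioned above can be illustrated with the simplest one, the rule of mixtures; the fibre and matrix moduli below are rough hypothetical values, not the measured properties of the sisal/banana-epoxy composites.

```python
def rule_of_mixtures(E_f, E_m, V_f):
    """Effective elastic moduli of a unidirectional composite from the
    constituent phases (Voigt and Reuss estimates; perfect bonding assumed)."""
    V_m = 1.0 - V_f
    E1 = E_f * V_f + E_m * V_m              # longitudinal (Voigt, iso-strain)
    E2 = 1.0 / (V_f / E_f + V_m / E_m)      # transverse (Reuss, iso-stress)
    return E1, E2

# hypothetical natural fibre (~20 GPa) in epoxy (~3 GPa), 40% fibre volume
E1, E2 = rule_of_mixtures(E_f=20.0, E_m=3.0, V_f=0.4)
```

Estimates like E1 and E2 are what feed the effective-property two-dimensional model, while the three-dimensional model resolves the phases individually.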
Abstract:
Epidemiological studies have shown the effect of diet on the incidence of chronic diseases; however, proper planning, designing, and statistical modeling are necessary to obtain precise and accurate food consumption data. Evaluation methods used for short-term assessment of food consumption of a population, such as tracking of food intake over 24h or food diaries, can be affected by random errors or biases inherent to the method. Statistical modeling is used to handle random errors, whereas proper designing and sampling are essential for controlling biases. The present study aimed to analyze potential biases and random errors and determine how they affect the results. We also aimed to identify ways to prevent them and/or to use statistical approaches in epidemiological studies involving dietary assessments.
Abstract:
OBJECTIVE To assess the validity of dengue fever reports and how they relate to the definition of case and severity. METHODS Diagnostic test assessment was conducted using cross-sectional sampling from a universe of 13,873 patients treated during the fifth epidemiological period in health institutions from 11 Colombian departments in 2013. The test under analysis was the reporting to the National Public Health Surveillance System, and the reference standard was the review of histories identified by active institutional search. We reviewed all histories of patients diagnosed with dengue fever, as well as a random sample of patients with febrile syndromes. The sensitivity and specificity of reporting were estimated, weighting by the inverse of the probability of selection. The concordance between reporting and the findings of the active institutional search was calculated using the Kappa statistic. RESULTS We included 4,359 febrile patients, and 31.7% were classified as compatible with dengue fever (17 with severe dengue fever; 461 with dengue fever and warning signs; 904 with dengue fever and no warning signs). The global sensitivity of reports was 13.2% (95%CI 10.9;15.4) and specificity was 98.4% (95%CI 97.9;98.9). Sensitivity varied according to severity: 12.1% (95%CI 9.3;14.8) for patients presenting dengue fever with no warning signs; 14.5% (95%CI 10.6;18.4) for those presenting dengue fever with warning signs, and 40.0% (95%CI 9.6;70.4) for those with severe dengue fever. Concordance between reporting and the findings of the active institutional search resulted in a Kappa of 10.1%. CONCLUSIONS Low concordance was observed between reporting and the review of clinical histories, which was associated with the low reporting of dengue fever compatible cases, especially milder cases.
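The reported measures (sensitivity, specificity, Kappa) all derive from a 2×2 agreement table, as sketched below; the counts are hypothetical and only echo the order of magnitude of the study, and the sketch omits the sampling weights the authors apply.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity and specificity of a reporting system against a reference
    standard, plus Cohen's kappa for agreement (unweighted sketch)."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                    # reported among true cases
    spec = tn / (tn + fp)                    # not reported among non-cases
    po = (tp + tn) / n                       # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    kappa = (po - pe) / (1.0 - pe)
    return sens, spec, kappa

# hypothetical counts: 40 of 300 true cases reported, 20 of 1000 non-cases
# falsely reported -- roughly mimicking the low sensitivity in the study
sens, spec, kappa = diagnostic_metrics(tp=40, fp=20, fn=260, tn=980)
```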
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Environmental Engineering.
Abstract:
In this paper, an algorithm for calculating the root locus of fractional linear systems is presented. The proposed algorithm takes advantage of present-day computational resources and processes the characteristic equation directly, avoiding the limitations revealed by standard methods. The results demonstrate good performance for different types of expressions.
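For the integer-order case, processing the characteristic equation directly reduces to a polynomial root sweep over the gain, as sketched below; the fractional-order case the paper addresses needs more than this (e.g. the substitution w = s^α for commensurate orders, before a polynomial solver applies). The transfer function is illustrative.

```python
import numpy as np

def root_locus(num, den, gains):
    """Closed-loop poles from the characteristic equation den(s) + K*num(s) = 0,
    found directly with a polynomial root finder for each gain K
    (integer-order sketch only)."""
    num = np.pad(np.asarray(num, dtype=float),
                 (len(den) - len(num), 0))       # align coefficient arrays
    den = np.asarray(den, dtype=float)
    return [np.roots(den + K * num) for K in gains]

# open loop G(s) = 1 / (s^2 + 2s); the locus starts at the poles {0, -2}
branches = root_locus([1.0], [1.0, 2.0, 0.0], gains=np.linspace(0.0, 5.0, 11))
```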
Abstract:
OBJECTIVE To analyze the cases of tuberculosis and the impact of direct follow-up on the assessment of treatment outcomes. METHODS This open prospective cohort study evaluated 504 cases of tuberculosis reported in the Sistema de Informação de Agravos de Notificação (SINAN – Notifiable Diseases Information System) in Juiz de Fora, MG, Southeastern Brazil, between 2008 and 2009. The incidence of treatment outcomes was compared between a group of patients diagnosed with tuberculosis and directly followed up through monthly consultations during return visits (287) and a patient group for which the information was collected indirectly (217) through the city’s surveillance system. The Chi-square test was used to compare the percentages, with a significance level of 0.05. The relative risk (RR) was used to evaluate the differences in the incidence rate of each type of treatment outcome between the two groups. RESULTS Of the outcomes directly and indirectly evaluated, 18.5% and 3.2% corresponded to treatment default and 3.8% and 0.5% corresponded to treatment failure, respectively. The incidence of treatment default and failure was higher in the group with direct follow-up (p < 0.05) (RR = 5.72, 95%CI 2.65;12.34, and RR = 8.31, 95%CI 1.08;63.92, respectively). CONCLUSIONS A higher incidence of treatment default and failure was observed in the directly followed up group, and most of these cases were neglected by the disease reporting system. Therefore, effective measures are needed to improve the control of tuberculosis and data quality.
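The relative risk and its 95% confidence interval can be computed as below (Katz log-normal method); the event counts are reconstructed from the reported percentages (18.5% of 287 vs 3.2% of 217) and are therefore approximate, though they closely reproduce the reported RR of 5.72.

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Relative risk of an outcome in a followed-up group (a events out of n1)
    versus a comparison group (b out of n2), with a log-normal 95% confidence
    interval (Katz method)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)      # standard error of log(RR)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# hypothetical counts for treatment default: 53/287 direct vs 7/217 indirect
rr, lo, hi = relative_risk(a=53, n1=287, b=7, n2=217)
```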
Abstract:
Adhesive bonding as a joining or repair method has a wide application in many industries. Repairs with bonded patches are often carried out to re-establish the stiffness at critical regions or at spots of corrosion and/or fatigue cracks. Single and double-strap repairs (SS and DS, respectively) are a viable repair option. For SS repairs, a patch is adhesively bonded to one of the faces of the structure. SS repairs are easy to execute, but the load eccentricity leads to peak peel stresses at the overlap edges. DS repairs involve the use of two patches, one on each face of the structure. These are more efficient than SS repairs, due to the doubling of the bonding area and the suppression of the transverse deflection of the adherends. Shear stresses also become more uniform as a result of smaller differential straining. The experimental and Finite Element (FE) study presented here for strength prediction and design optimization of bonded repairs includes SS and DS solutions with different values of the overlap length (LO): 10, 20 and 30 mm. The failure strengths of the SS and DS repairs were compared with FE results obtained with the Abaqus® FE software. A Cohesive Zone Model (CZM) with a triangular shape in pure tensile and shear modes, including the mixed-mode possibility for crack growth, was used to simulate fracture of the adhesive layer. A good agreement was found between the experiments and the FE simulations regarding the failure modes, elastic stiffness and strength of the repairs, showing the effectiveness and applicability of the proposed FE technique in predicting the strength of bonded repairs. Furthermore, some optimization principles were proposed for repairing structures with adhesively bonded patches, which will allow designers to design bonded repairs effectively.
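The triangular CZM law used in the simulations can be sketched for a pure mode as follows; the stiffness, peak traction and failure separation below are illustrative values, not the calibrated properties of the adhesive in the study.

```python
def triangular_traction(delta, k0, t_max, delta_f):
    """Triangular (bilinear) cohesive law in a pure mode: linear stiffness k0
    up to the peak traction t_max, then linear softening to zero traction at
    the failure separation delta_f."""
    delta_0 = t_max / k0                     # separation at damage onset
    if delta <= delta_0:
        return k0 * delta                    # undamaged elastic branch
    if delta >= delta_f:
        return 0.0                           # complete failure, crack faces free
    return t_max * (delta_f - delta) / (delta_f - delta_0)  # softening branch

# illustrative units: k0 in MPa/mm, tractions in MPa, separations in mm
tractions = [triangular_traction(d, k0=1e6, t_max=10.0, delta_f=0.1)
             for d in (5e-6, 1e-5, 0.05, 0.2)]
```

The area under this traction-separation curve is the fracture toughness in that mode; mixed-mode crack growth, as in the paper, couples the tensile and shear laws through an energetic criterion.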