972 results for Regression method


Relevance: 20.00%

Abstract:

OBJECTIVE To analyze cervical and breast cancer mortality in Brazil according to socioeconomic and welfare indicators. METHODS Data on breast and cervical cancer mortality covering a 30-year period (1980-2010) were analyzed. The mortality data were obtained from the National Mortality Database, population data from the Brazilian Institute of Geography and Statistics database, and socioeconomic and welfare information from the Institute of Applied Economic Research. Moving averages were calculated, disaggregated by capital city and municipality. The annual percent change in mortality rates was estimated by segmented linear regression using the joinpoint method. Pearson’s correlation coefficients were calculated between the average mortality rate at the end of the three-year period and selected indicators for the state capital and each Brazilian state. RESULTS There was a decline in cervical cancer mortality rates throughout the period studied, except in municipalities outside of the capitals in the North and Northeast. There was a decrease in breast cancer mortality in the capitals from the end of the 1990s onwards. Favorable socioeconomic indicators were inversely correlated with cervical cancer mortality. Breast cancer mortality in inner cities showed a strong direct correlation with favorable indicators and an inverse correlation with the fertility rate. CONCLUSIONS There is an ongoing dynamic process of increased risk of cervical and breast cancer and attenuation of mortality because of increased, albeit unequal, access to and provision of screening, diagnosis and treatment.
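
As a concrete illustration of the trend estimation step, the sketch below fits a log-linear segment to yearly rates and converts the slope into an annual percent change, which is what the joinpoint approach does piecewise between change points. The years and rates are made up for the example and are not the study data.

```python
import numpy as np

# Illustrative yearly mortality rates for one segment between two joinpoints
# (values are invented for the sketch, not taken from the study).
years = np.arange(2000, 2011)
rates = np.array([10.2, 9.9, 9.7, 9.4, 9.1, 8.9, 8.7, 8.4, 8.2, 8.0, 7.8])

# Fit a log-linear trend: log(rate) = a + b * year
b, a = np.polyfit(years, np.log(rates), 1)

# Annual percent change implied by the slope of the log-linear fit
apc = (np.exp(b) - 1) * 100
print(f"APC: {apc:.2f}% per year")
```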

Relevance: 20.00%

Abstract:

OBJECTIVE To propose a method of redistributing ill-defined causes of death (IDCD) based on the investigation of such causes. METHODS In 2010, an evaluation of the results of investigating the causes of death classified as IDCD in accordance with chapter 18 of the International Classification of Diseases (ICD-10) by the Mortality Information System was performed. The redistribution coefficients were calculated according to the proportional distribution of ill-defined causes reclassified after investigation into any chapter of the ICD-10 except chapter 18, and were used to redistribute the uninvestigated and remaining ill-defined causes by sex and age. The IDCD redistribution coefficient was compared with two usual methods of redistribution: a) the total redistribution coefficient, based on the proportional distribution of all the defined causes originally notified, and b) the non-external redistribution coefficient, similar to the previous one but excluding external causes. RESULTS Of the 97,314 deaths by ill-defined causes reported in 2010, 30.3% were investigated, and 65.5% of those were reclassified as defined causes after the investigation. Endocrine diseases, mental disorders, and maternal causes were over-represented among the reclassified ill-defined causes, whereas infectious diseases, neoplasms, and genitourinary diseases had higher proportions among the defined causes originally reported. External causes represented 9.3% of the ill-defined causes reclassified. The correction of mortality rates by the total redistribution coefficient and the non-external redistribution coefficient increased the magnitude of the rates by a relatively similar factor for most causes, unlike the IDCD redistribution coefficient, which corrected the different causes of death with differentiated weights. CONCLUSIONS The proportional distribution of causes among the ill-defined causes reclassified after investigation was not similar to the original distribution of defined causes. Therefore, the redistribution of the remaining ill-defined causes based on the investigation allows for more appropriate estimates of the mortality risk due to specific causes.
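
A minimal sketch of the proposed proportional redistribution, assuming illustrative chapter counts rather than the actual 2010 investigation results: coefficients are computed from the investigated ill-defined deaths that were reclassified, then applied to the ill-defined deaths that were not investigated.

```python
import pandas as pd

# Illustrative counts of investigated ill-defined deaths reclassified into
# ICD-10 chapters (numbers are invented for the sketch).
reclassified = pd.Series(
    {"Infectious": 120, "Neoplasms": 340, "Circulatory": 900, "External": 150}
)

# Redistribution coefficients: proportional share of each chapter among the
# ill-defined causes reclassified after investigation.
coefficients = reclassified / reclassified.sum()

# Redistribute the ill-defined deaths that were not investigated.
remaining_idcd = 5000
redistributed = (coefficients * remaining_idcd).round()
print(redistributed)
```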

Relevance: 20.00%

Abstract:

A simple procedure to measure the cohesive laws of bonded joints under mode I loading using the double cantilever beam test is proposed. The method only requires recording the applied load–displacement data and measuring the crack opening displacement at its tip in the course of the experimental test. The strain energy release rate is obtained by a procedure involving the Timoshenko beam theory, the specimen’s compliance and the crack equivalent concept. Following the proposed approach, the influence of the fracture process zone is taken into account, which is fundamental for an accurate estimation of the failure process details. The cohesive law is obtained by differentiation of the strain energy release rate as a function of the crack opening displacement. The model was validated numerically considering three representative cohesive laws. Numerical simulations using finite element analysis including cohesive zone modeling were performed. The good agreement between the input and resulting laws for all the cases considered validates the model. An experimental confirmation was also performed by comparing the numerical and experimental load–displacement curves. The numerical load–displacement curves were obtained by adjusting typical cohesive laws to the ones measured experimentally following the proposed approach, using finite element analysis including cohesive zone modeling. Once again, good agreement was obtained in the comparisons, demonstrating the good performance of the proposed methodology.
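
The key step, differentiating the strain energy release rate with respect to the crack opening displacement to recover the cohesive law, can be sketched numerically as follows; the G(w) curve used here is synthetic, not the measured compliance-based one.

```python
import numpy as np

# Synthetic strain energy release rate G_I(w) as a function of the crack tip
# opening displacement w (illustrative curve, not experimental data).
w = np.linspace(0.0, 0.5, 500)          # crack opening displacement [mm]
G_Ic, w0 = 0.5, 0.05                    # toughness [N/mm] and a shape parameter
G = G_Ic * (1.0 - (1.0 + w / w0) * np.exp(-w / w0))

# Cohesive law sigma(w) = dG/dw, evaluated by numerical differentiation.
sigma = np.gradient(G, w)
print("peak cohesive stress ~", sigma.max(), "at w =", w[sigma.argmax()])
```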

Relevance: 20.00%

Abstract:

Constrained nonlinear optimization problems can be solved using penalty or barrier functions. This strategy, based on solving unconstrained problems obtained from the original problem, has been shown to be effective, particularly when used with direct search methods. An alternative for solving such problems is the filters method. The filters method, introduced by Fletcher and Leyffer in 2002, has been widely used to solve problems of the type mentioned above. These methods use a strategy different from barrier or penalty functions: the latter define a new function that combines the objective function and the constraints, while the filters method treats the optimization problem as a bi-objective problem that minimizes the objective function and a function that aggregates the constraints. Motivated by the work of Audet and Dennis in 2004, which used the filters method with derivative-free algorithms, the authors developed works in which other direct search methods were used, combining their potential with the filters method. More recently, a new variant of these methods was presented, in which some alternative constraint aggregations for the construction of filters were proposed. This paper presents a variant of the filters method, more robust than the previous ones, that has been implemented with a safeguard procedure in which values of the objective function and constraints are interlinked and not treated completely independently.
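
For reference, a minimal sketch of the core filter idea described above: each trial point is summarized by its objective value f and an aggregated constraint violation h, and it is accepted only if no stored filter entry dominates it. This is a simplified illustration, not the authors' implementation, and it omits the envelopes and the safeguard procedure the paper discusses.

```python
# Minimal sketch of a filter acceptance test for constrained optimization:
# a trial point (f, h) is accepted only if no stored filter entry dominates it,
# i.e. no entry has both a smaller-or-equal objective f and violation h.

def constraint_violation(g_values):
    """Aggregate inequality constraints g_i(x) <= 0 into a single violation h(x)."""
    return sum(max(0.0, g) for g in g_values)

def dominated(entry, point):
    f_e, h_e = entry
    f_p, h_p = point
    return f_e <= f_p and h_e <= h_p

def filter_accepts(filter_entries, point):
    return not any(dominated(e, point) for e in filter_entries)

# Usage: a filter with two entries and two trial points.
flt = [(3.0, 0.2), (5.0, 0.0)]
print(filter_accepts(flt, (2.5, 0.1)))   # True: not dominated, would be added
print(filter_accepts(flt, (6.0, 0.3)))   # False: dominated by (3.0, 0.2)
```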

Relevance: 20.00%

Abstract:

Constrained nonlinear optimization problems are usually solved using penalty or barrier methods combined with unconstrained optimization methods. Another alternative used to solve constrained nonlinear optimization problems is the filters method. The filters method, introduced by Fletcher and Leyffer in 2002, has been widely used in several areas of constrained nonlinear optimization. These methods treat the optimization problem as a bi-objective problem that attempts to minimize the objective function and a continuous function that aggregates the constraint violation functions. Audet and Dennis presented the first filters method for derivative-free nonlinear programming, based on pattern search methods. Motivated by this work, we have developed a new direct search method, based on simplex methods, for general constrained optimization, that combines the features of the simplex method and the filters method. This work presents a new variant of these methods that combines the filters method with other direct search methods, and some alternatives to aggregate the constraint violation functions are proposed.

Relevance: 20.00%

Abstract:

Adhesive bonding for the joining of multi-component structures is gaining momentum over welding, riveting and fastening. The availability of accurate damage models is vital for the design of bonded structures, to minimize design costs and time to market. Cohesive Zone Models (CZM) have been used for fracture prediction in structures. The eXtended Finite Element Method (XFEM) is a recent improvement of the Finite Element Method (FEM) that relies on traction-separation laws similar to those of CZM but allows the growth of discontinuities within bulk solids along an arbitrary path, by enriching degrees of freedom. This work proposes and validates a damage law to model crack propagation in a thin layer of a structural epoxy adhesive using the XFEM. The fracture toughness in pure mode I (GIc) and the tensile cohesive strength (sn0) were defined by Double-Cantilever Beam (DCB) and bulk tensile tests, respectively, which permitted building the damage law. The XFEM simulations of the DCB tests accurately matched the experimental load-displacement (P-d) curves, which validated the analysis procedure.
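
The damage law built from GIc and sn0 is typically a traction-separation relation; the sketch below shows a triangular (linear-softening) law of that kind with illustrative parameter values, not the measured ones, and an assumed initial stiffness.

```python
import numpy as np

def triangular_traction(delta, sigma_max, G_Ic, E_stiff=1e4):
    """Triangular traction-separation law.

    Rising branch with initial stiffness E_stiff up to the cohesive strength
    sigma_max, then linear softening until the area under the curve equals G_Ic.
    """
    delta_0 = sigma_max / E_stiff          # separation at damage onset
    delta_f = 2.0 * G_Ic / sigma_max       # final separation (triangle area = G_Ic)
    if delta <= delta_0:
        return E_stiff * delta
    if delta >= delta_f:
        return 0.0
    return sigma_max * (delta_f - delta) / (delta_f - delta_0)

# Illustrative parameters (not the values measured in the study).
for d in np.linspace(0.0, 0.02, 5):
    print(d, triangular_traction(d, sigma_max=20.0, G_Ic=0.2))
```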

Relevance: 20.00%

Abstract:

This paper presents the Pseudo Phase Plane (PPP) method for detecting the existence of a nanofilm on the nitroazobenzene-modified glassy carbon electrode (NAB-GC) system. The modified electrode system and the nitroazobenzene nanofilm were prepared by the electrochemical reduction of the diazonium salt of NAB at glassy carbon electrodes (GCE) in nonaqueous media. The IR spectra of the bare glassy carbon electrodes (GCE), the NAB-GC electrode system and the organic NAB film were recorded. The IR data of the bare GC, NAB-GC and NAB film were categorized into series consisting of FILM1, GC-NAB1, GC1; FILM2, GC-NAB2, GC2; FILM3, GC-NAB3, GC3; and FILM4, GC-NAB4, GC4, respectively. The PPP approach was applied to each group of data from the unmodified and modified electrode systems with the nanofilm. The results provided by the PPP method show the existence of the NAB film on the modified GC electrode.
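
A pseudo phase plane is built by plotting a signal against a delayed copy of itself; the sketch below illustrates the construction on a synthetic series with an arbitrarily chosen delay, standing in for the IR data series used in the paper.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative signal standing in for an IR data series (not the study data).
t = np.linspace(0, 10, 1000)
x = np.sin(2 * np.pi * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

# Pseudo phase plane: plot x(k) against x(k + tau) for a chosen delay tau.
tau = 25
plt.plot(x[:-tau], x[tau:], ".", markersize=2)
plt.xlabel("x(k)")
plt.ylabel(f"x(k + {tau})")
plt.title("Pseudo phase plane (illustrative)")
plt.show()
```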

Relevance: 20.00%

Abstract:

In this work, the shear modulus and strength of the acrylic adhesive 3M® DP 8005 were evaluated by two different methods: the Thick Adherend Shear Test (TAST) and the Notched Plate Shear Method (Arcan). However, TAST standards advise the use of a special extensometer attached to the specimen, which requires a very experienced technician. In the present study, the adhesive shear displacement for the TAST was measured using an optical technique, and also with a conventional 25 mm inductive extensometer used for tensile tests. This allowed an assessment of the suitability of using a conventional extensometer to measure this parameter. Since the results obtained by the two techniques are identical, it can be concluded that using a conventional extensometer is a valid option to obtain the shear modulus for the particular adhesive used. In the Arcan tests, the adhesive shear displacement was only measured using the optical technique. This work also aimed at comparing the shear modulus and strength obtained by the TAST and Arcan test methods.
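
In the linear elastic range, the adhesive shear modulus follows from the measured shear displacement across the bondline; a back-of-the-envelope sketch with made-up numbers (not the DP 8005 results) is shown below.

```python
# Minimal sketch of extracting an adhesive shear modulus from TAST-type data
# in the linear elastic range (illustrative values, not the measured ones).
P = 2000.0             # applied load [N]
overlap_length = 5.0   # bonded overlap length [mm]
width = 25.0           # specimen width [mm]
t_adhesive = 0.5       # adhesive thickness [mm]
delta_shear = 0.01     # measured adhesive shear displacement [mm]

tau = P / (overlap_length * width)    # average shear stress [MPa]
gamma = delta_shear / t_adhesive      # shear strain [-]
G = tau / gamma                       # shear modulus [MPa]
print(f"tau = {tau:.1f} MPa, gamma = {gamma:.3f}, G = {G:.0f} MPa")
```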

Relevance: 20.00%

Abstract:

It is important to understand and forecast the daily consumption of a typical or a particular household in order to design and size suitable renewable energy systems and energy storage. In this research on Short Term Load Forecasting (STLF), Artificial Neural Networks (ANN) were used and, despite the unpredictability of consumption, the possibility of forecasting the electricity consumption of a household with certainty was shown. ANNs are recognized as a potential methodology for modeling hourly and daily energy consumption and load forecasting. Input variables such as apartment area, number of occupants, electrical appliance consumption and Boolean inputs such as the hourly metering system were considered. Furthermore, the investigation carried out aimed to define an ANN architecture and a training algorithm in order to achieve a robust model to be used in forecasting energy consumption in a typical household. It was observed that a feed-forward ANN and the Levenberg-Marquardt algorithm provided good performance. A database of consumption records logged in 93 real households in Lisbon, Portugal, between February 2000 and July 2001, including both weekdays and weekends, was used. The results show that the ANN approach provides a reliable model for forecasting household electric energy consumption and load profile. © 2014 The Author.
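
A minimal sketch of a feed-forward network regression for this kind of STLF problem, using synthetic records in place of the Lisbon database; scikit-learn has no Levenberg-Marquardt trainer, so the 'lbfgs' solver is used here as a stand-in for the training algorithm reported above.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic records standing in for the household database:
# [apartment area, occupants, appliance index, hour of day] -> hourly load (kW).
rng = np.random.default_rng(0)
X = rng.uniform([50, 1, 0, 0], [200, 6, 10, 23], size=(500, 4))
y = (0.01 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2]
     + 0.05 * np.sin(X[:, 3] / 24 * 2 * np.pi)
     + rng.normal(scale=0.1, size=500))

# Feed-forward network; 'lbfgs' used as a substitute for Levenberg-Marquardt.
model = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                     max_iter=2000, random_state=0)
model.fit(X, y)
print("R^2 on training data:", model.score(X, y))
```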

Relevance: 20.00%

Abstract:

OBJECTIVE To analyze the spatial distribution of risk for tuberculosis and its socioeconomic determinants in the city of Rio de Janeiro, Brazil. METHODS An ecological study on the association between the mean incidence rate of tuberculosis from 2004 to 2006 and socioeconomic indicators of the Censo Demográfico (Demographic Census) of 2000. The unit of analysis was the home district registered in the Sistema de Informação de Agravos de Notificação (Notifiable Diseases Information System) of Rio de Janeiro, Southeastern Brazil. The rates were standardized by sex and age group, and smoothed by the empirical Bayes method. Spatial autocorrelation was evaluated by Moran’s I. Multiple linear regression models were studied and the appropriateness of incorporating the spatial component in modeling was evaluated. RESULTS We observed a higher risk of the disease in some neighborhoods of the port and north regions, as well as a high incidence in the slums of Rocinha and Vidigal, in the south region, and Cidade de Deus, in the west. The final model identified a positive association for the variables: percentage of permanent private households in which the head of the house earns three to five minimum wages; percentage of individual residents in the neighborhood; and percentage of people living in homes with more than two people per bedroom. CONCLUSIONS The spatial analysis identified areas of risk of tuberculosis incidence in the neighborhoods of the city of Rio de Janeiro and also found spatial dependence for the incidence of tuberculosis and some socioeconomic variables. However, the inclusion of the spatial component in the final model was not required during the modeling process.
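
Global spatial autocorrelation as measured by Moran's I can be computed directly from the smoothed rates and a contiguity weights matrix; the sketch below uses a toy four-area example, not the Rio de Janeiro neighborhood data.

```python
import numpy as np

def morans_i(y, W):
    """Global Moran's I for values y and a spatial weights matrix W."""
    y = np.asarray(y, dtype=float)
    z = y - y.mean()          # deviations from the mean
    n = y.size
    S0 = W.sum()              # sum of all spatial weights
    return (n / S0) * (z @ W @ z) / (z @ z)

# Illustrative example: 4 areas on a line (rook contiguity weights).
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rates = [12.0, 14.0, 30.0, 33.0]   # smoothed incidence rates (made-up values)
print(morans_i(rates, W))
```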

Relevance: 20.00%

Abstract:

OBJECTIVE To analyze if dietary patterns during the third gestational trimester are associated with birth weight. METHODS Longitudinal study conducted in the cities of Petropolis and Queimados, Rio de Janeiro (RJ), Southeastern Brazil, between 2007 and 2008. We analyzed data from the first and second follow-up wave of a prospective cohort. Food consumption of 1,298 pregnant women was assessed using a semi-quantitative questionnaire about food frequency. Dietary patterns were obtained by exploratory factor analysis, using the Varimax rotation method. We also applied the multivariate linear regression model to estimate the association between food consumption patterns and birth weight. RESULTS Four patterns of consumption – which explain 36.4% of the variability – were identified and divided as follows: (1) prudent pattern (milk, yogurt, cheese, fruit and fresh-fruit juice, cracker, and chicken/beef/fish/liver), which explained 14.9% of the consumption; (2) traditional pattern, consisting of beans, rice, vegetables, breads, butter/margarine and sugar, which explained 8.8% of the variation in consumption; (3) Western pattern (potato/cassava/yams, macaroni, flour/farofa/grits, pizza/hamburger/deep fried pastries, soft drinks/cool drinks and pork/sausages/egg), which accounts for 6.9% of the variance; and (4) snack pattern (sandwich cookie, salty snacks, chocolate, and chocolate drink mix), which explains 5.7% of the consumption variability. The snack dietary pattern was positively associated with birth weight (β = 56.64; p = 0.04) in pregnant adolescents. CONCLUSIONS For pregnant adolescents, the greater the adherence to snack pattern during pregnancy, the greater the baby’s birth weight.
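
A minimal sketch of the exploratory factor analysis step with varimax rotation, assuming the third-party factor_analyzer package and random data in place of the food frequency questionnaire.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # assumed third-party package

# Illustrative food-frequency matrix: rows are participants, columns are food
# items (random counts standing in for the questionnaire, not the study data).
rng = np.random.default_rng(0)
foods = ["milk", "fruit", "beans", "rice", "soft_drink", "snacks", "chocolate", "bread"]
X = pd.DataFrame(rng.poisson(3, size=(300, len(foods))), columns=foods)

# Exploratory factor analysis with varimax rotation, retaining four factors
# as in the dietary-pattern approach described above.
fa = FactorAnalyzer(n_factors=4, rotation="varimax")
fa.fit(X)
loadings = pd.DataFrame(fa.loadings_, index=foods)
print(loadings.round(2))
```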

Relevance: 20.00%

Abstract:

OBJECTIVE To estimate the incidence and predicting factors associated with falls among older inpatients. METHODS Prospective cohort study conducted in clinical units of three hospitals in Cuiaba, MT, Midwestern Brazil, from March to August 2013. In this study, 221 inpatients aged 60 or over were followed until hospital discharge, death, or fall. The method of incidence density was used to calculate incidence rates. Bivariate analysis was performed by the chi-square test, and multiple analysis was performed by Cox regression. RESULTS The incidence of falls was 12.6 per 1,000 patient-days. Predicting factors for falls during hospitalization were: low educational level (RR = 2.48; 95%CI 1.17;5.25), polypharmacy (RR = 4.42; 95%CI 1.77;11.05), visual impairment (RR = 2.06; 95%CI 1.01;4.23), gait and balance impairment (RR = 2.95; 95%CI 1.22;7.14), urinary incontinence (RR = 5.67; 95%CI 2.58;12.44), and use of laxatives (RR = 4.21; 95%CI 1.15;15.39) and antipsychotics (RR = 4.10; 95%CI 1.38;12.13). CONCLUSIONS The incidence of falls among older inpatients is high. Predicting factors found for falls were low education level, polypharmacy, visual impairment, gait and balance impairment, urinary incontinence, and use of laxatives and antipsychotics. Measures to prevent falls in hospitals are needed to reduce the incidence of this event.
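
A minimal sketch of the two analytical steps, incidence density and Cox regression, assuming the third-party lifelines package and a handful of made-up follow-up records rather than the cohort data.

```python
import pandas as pd
from lifelines import CoxPHFitter  # assumed third-party package

# Illustrative follow-up data: days of hospitalization, fall event indicator,
# and two candidate predictors (invented records, not the study data).
df = pd.DataFrame({
    "days":         [5, 12, 7, 20, 3, 15, 9, 11, 6, 18, 4, 10],
    "fell":         [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0],
    "polypharmacy": [0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1],
    "incontinence": [0, 1, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0],
})

# Incidence density: falls per 1,000 patient-days of follow-up.
incidence = 1000 * df["fell"].sum() / df["days"].sum()
print(f"{incidence:.1f} falls per 1,000 patient-days")

# Cox proportional hazards model for the candidate predictors of falling.
cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="fell")
cph.print_summary()
```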

Relevance: 20.00%

Abstract:

This paper presents a methodology for applying scheduling algorithms using Monte Carlo simulation. The methodology is based on a decision support system (DSS). The proposed methodology combines a genetic algorithm with a new local search based on the Monte Carlo method. The methodology is applied to the job shop scheduling problem (JSSP). The JSSP is a difficult problem in combinatorial optimization for which extensive investigation has been devoted to the development of efficient algorithms. The methodology is tested on a set of standard instances taken from the literature and compared with others. The computational results validate the effectiveness of the proposed methodology. The DSS developed can be utilized in a common industrial or construction environment.
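
The Monte Carlo local search can be sketched generically as random sampling of swap neighbours of a permutation-encoded schedule; the makespan evaluation below is a toy surrogate, since decoding a JSSP schedule (and the surrounding genetic algorithm) is beyond the scope of this illustration.

```python
import random

def monte_carlo_local_search(solution, makespan, n_samples=100, seed=0):
    """Sample random swap neighbours of a permutation-encoded schedule, keep the best.

    `makespan` is a user-supplied evaluation function; the JSSP schedule
    decoding itself is problem-specific and omitted here.
    """
    rng = random.Random(seed)
    best, best_cost = solution[:], makespan(solution)
    for _ in range(n_samples):
        i, j = rng.sample(range(len(solution)), 2)
        cand = solution[:]
        cand[i], cand[j] = cand[j], cand[i]
        cost = makespan(cand)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

# Usage with a toy surrogate objective standing in for a JSSP makespan decoder.
toy_cost = lambda perm: sum(abs(v - i) for i, v in enumerate(perm))
print(monte_carlo_local_search(list(range(8))[::-1], toy_cost))
```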

Relevance: 20.00%

Abstract:

In this article, we calibrate the Vasicek interest rate model under the risk-neutral measure by learning the model parameters using Gaussian processes for machine learning regression. The calibration is done by maximizing the likelihood of zero-coupon bond log prices, using mean and covariance functions computed analytically, as well as likelihood derivatives with respect to the parameters. The maximization method used is conjugate gradients. The only prices needed for calibration are zero-coupon bond prices, and the parameters are obtained directly in the arbitrage-free risk-neutral measure.
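
The observations being modeled are zero-coupon bond log prices; for reference, a sketch of the closed-form Vasicek bond price under the risk-neutral dynamics dr = a(b - r)dt + sigma dW, with illustrative (uncalibrated) parameters. The Gaussian-process likelihood maximization itself is not reproduced here.

```python
import numpy as np

def vasicek_zcb_price(r0, a, b, sigma, tau):
    """Zero-coupon bond price under the Vasicek model dr = a(b - r)dt + sigma dW
    (risk-neutral parameters), for time to maturity tau."""
    B = (1.0 - np.exp(-a * tau)) / a
    A = np.exp((b - sigma**2 / (2 * a**2)) * (B - tau) - sigma**2 * B**2 / (4 * a))
    return A * np.exp(-B * r0)

# Illustrative parameters (not calibrated values); log prices would be the
# Gaussian-process observations in the calibration described above.
prices = vasicek_zcb_price(r0=0.03, a=0.5, b=0.04, sigma=0.01,
                           tau=np.array([1.0, 5.0, 10.0]))
print(prices, np.log(prices))
```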

Relevance: 20.00%

Abstract:

This paper addresses the problem of optimal positioning of surface bonded piezoelectric patches in sandwich plates with viscoelastic core and laminated face layers. The objective is to maximize a set of modal loss factors for a given frequency range using multiobjective topology optimization. Active damping is introduced through co-located negative velocity feedback control. The multiobjective topology optimization problem is solved using the Direct MultiSearch Method. An application to a simply supported sandwich plate is presented with results for the maximization of the first six modal loss factors. The influence of the finite element mesh is analyzed and the results are, to some extent, compared with those obtained using alternative single objective optimization.