818 results for Robust Regression
Abstract:
Introduction: The Charlson index (Charlson, 1987) is a commonly used comorbidity index in outcome studies. Still, the use of different weights makes its calculation cumbersome, while the sum of its components (comorbidities) is easier to compute. In this study, we assessed the effects of 1) the Charlson index adapted for the Swiss population and 2) the sum of its components (number of comorbidities, maximum 15) on a) in-hospital deaths and b) cost of hospitalization. Methods: Anonymous data were obtained from the administrative database of the department of internal medicine of the Lausanne University Hospital (CHUV). All hospitalizations of adult (>=18 years) patients occurring between 2003 and 2011 were included. For each hospitalization, the Charlson index and the number of comorbidities were calculated. Analyses were conducted using Stata. Results: Data from 32,741 hospitalizations occurring between 2003 and 2011 were analyzed. On bivariate analysis, both the Charlson index and the number of comorbidities were significantly and positively associated with in-hospital death. Conversely, multivariate adjustment for age, gender and calendar year using Cox regression showed that the association was no longer significant for the number of comorbidities (table). On bivariate analysis, hospitalization costs increased with both the Charlson index and the number of comorbidities, but the increase was much steeper for the number of comorbidities (figure). Robust regression adjusting for age, gender, calendar year and duration of hospital stay showed that an increase of one comorbidity led to an average increase in hospital costs of 321 CHF (95% CI: 272 to 370), while an increase of one Charlson index point led to a decrease in hospital costs of 49 CHF (95% CI: 31 to 67). Conclusion: The Charlson index is better than the number of comorbidities at predicting in-hospital death. Conversely, the number of comorbidities significantly increases hospital costs.
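As an illustration of the kind of model reported above, the sketch below fits a robust (Huber M-estimator) regression of hospitalization cost on the number of comorbidities, adjusted for age, sex, calendar year and length of stay, using Python's statsmodels. The file name, column names and the choice of Huber weighting are assumptions for illustration, not the authors' actual Stata specification.

```python
# Sketch of a robust (M-estimator) regression of hospitalization cost on the
# number of comorbidities, adjusted for age, sex, calendar year and length of
# stay. Column names and the Huber norm are assumptions for illustration.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("hospitalizations.csv")  # hypothetical file

model = smf.rlm(
    "cost ~ n_comorbidities + age + C(sex) + C(year) + los",
    data=df,
    M=sm.robust.norms.HuberT(),  # down-weights outlying cost observations
)
result = model.fit()
print(result.summary())  # coefficient on n_comorbidities ~ CHF per extra comorbidity
```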
Abstract:
Introduction: Mean platelet volume (MPV) was shown to be significantly increased in patients with acute ischaemic stroke, especially in non-lacunar strokes. Moreover, some studies concluded that increased MPV is related to poor functional outcome after ischaemic stroke, although this association is still controversial. However, the determinants of MPV in patients with acute ischaemic stroke have never been investigated. Subjects and methods: We recorded the main demographic, clinical and laboratory data of consecutive patients with acute (admitted within 24 h after stroke onset) ischaemic stroke admitted to our Neurology Service between January 2003 and December 2008. MPV was measured at admission with the Sysmex XE-2100 automated cell counter (Sysmex Corporation, Kobe, Japan) from ethylenediaminetetraacetic acid blood samples stored at room temperature until measurement. The association of these parameters with MPV was investigated in univariate and multivariate analysis. Results: A total of 636 patients were included in our study. The median MPV was 10.4 ± 0.82 fL. In univariate analysis, glucose (β = 0.03, P = 0.05), serum creatinine (β = 0.002, P = 0.02), haemoglobin (β = 0.009, P < 0.001), platelet count (β = -0.002, P < 0.001) and history of arterial hypertension (β = 0.21, P = 0.005) were found to be significantly associated with MPV. In multivariate robust regression analysis, only hypertension and platelet count remained independent determinants of MPV. Conclusions: In patients with acute ischaemic stroke, platelet count and history of hypertension are the only determinants of MPV.
Abstract:
Studies on water retention and availability are scarce for subtropical or humid temperate climate regions of the southern hemisphere. The aims of this study were to evaluate the relations of the soil physical, chemical, and mineralogical properties with water retention and availability for the generation and validation of continuous point pedotransfer functions (PTFs) for soils of the State of Santa Catarina (SC) in the South of Brazil. Horizons of 44 profiles were sampled in areas under different cover crops and regions of SC, to determine: field capacity (FC, 10 kPa), permanent wilting point (PWP, 1,500 kPa), available water content (AW, by difference), saturated hydraulic conductivity, bulk density, aggregate stability, particle size distribution (seven classes), organic matter content, and particle density. Chemical and mineralogical properties were obtained from the literature. Spearman's rank correlation analysis and path analysis were used in the statistical analyses. The point PTFs for estimation of FC, PWP and AW were generated for the soil surface and subsurface through multiple regression analysis, followed by robust regression analysis, using two sets of predictive variables. Soils with finer texture and/or greater organic matter content retain more moisture, and organic matter is the property that mainly controls the water availability to plants in soil surface horizons. Path analysis was useful in understanding the relationships between soil properties for FC, PWP and AW. The predictive power of the generated PTFs to estimate FC and PWP was good for all horizons, while AW was best estimated by more complex models with better prediction for the surface horizons of soils in Santa Catarina.
Abstract:
By definition, obesity is an excess of fat mass relative to total body mass. Body fat can be calculated as a function of age and sex by measuring skinfold thickness at several sites. During the MONICA project, the survey of cardiovascular risk factor prevalence enabled us to measure the thickness of four skinfolds (biceps, triceps, subscapular, suprailiac) in 263 inhabitants of Lausanne (125 men, 138 women). In men aged 25-34, 21 ± 5% of body mass was composed of fat, compared with 29 ± 4% in women. The proportion of fat rises to 31 ± 7% in men and 41 ± 6% in women aged 55-64. A robust regression allows body fat to be expressed simply in terms of the body mass index, confirming the validity of this index for evaluating the degree of obesity in an epidemiological study.
Abstract:
BACKGROUND AND PURPOSE: Hyperglycemia after stroke is associated with larger infarct volume and poorer functional outcome. In an animal stroke model, the association between serum glucose and infarct volume is described by a U-shaped curve with a nadir ≈7 mmol/L. However, a similar curve in human studies has never been reported. The objective of the present study is to investigate the association between serum glucose levels and functional outcome in patients with acute ischemic stroke. METHODS: We analyzed 1446 consecutive patients with acute ischemic stroke. Serum glucose was measured on admission at the emergency department together with multiple other metabolic, clinical, and radiological parameters. National Institutes of Health Stroke Scale (NIHSS) score was recorded at 24 hours, and Rankin score was recorded at 3 and 12 months. The association between serum glucose and favorable outcome (Rankin score ≤2) was explored in univariate and multivariate analysis. The model was further analyzed in a robust regression model based on fractional polynomial (powers -2 to 2) functions. RESULTS: Serum glucose is independently correlated with functional outcome at 12 months (OR, 1.15; P=0.01). Other predictors of outcome include admission NIHSS score (OR, 1.18; P<0.001), age (OR, 1.06; P<0.001), prestroke Rankin score (OR, 20.8; P=0.004), and leukoaraiosis (OR, 2.21; P=0.016). Using these factors in multiple logistic regression analysis, the area under the receiver-operating characteristic curve is 0.869. The association between serum glucose and Rankin score at 12 months is described by a J-shaped curve with a nadir of 5 mmol/L. Glucose values between 3.7 and 7.3 mmol/L are associated with favorable outcome. A similar curve was generated for the association of glucose and 24-hour NIHSS score, for which glucose values between 4.0 and 7.2 mmol/L are associated with an NIHSS score <7. DISCUSSION: Both hypoglycemia and hyperglycemia are dangerous in acute ischemic stroke, as shown by a J-shaped association between serum glucose and 24-hour and 12-month outcome. Initial serum glucose values between 3.7 and 7.3 mmol/L are associated with favorable outcome.
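The fractional polynomial analysis mentioned above can be sketched by screening candidate power transforms of admission glucose against the binary favorable-outcome indicator with a logistic model and keeping the best-fitting power. This is a simplified first-degree (FP1) version with hypothetical variable names; the study's actual robust-regression implementation is not reproduced here, and a U/J-shaped relation would typically need two terms (FP2).

```python
# Sketch: screen first-degree fractional-polynomial transforms of admission glucose
# against a binary "favorable outcome" indicator. Variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fp_transform(x, p):
    """Single fractional-polynomial term: x**p, with p == 0 meaning log(x)."""
    return np.log(x) if p == 0 else x ** p

powers = [-2, -1, -0.5, 0, 0.5, 1, 2]    # conventional FP1 power set
df = pd.read_csv("stroke_cohort.csv")     # hypothetical file with glucose, favorable columns

best = None
for p in powers:
    X = sm.add_constant(fp_transform(df["glucose"], p))
    fit = sm.Logit(df["favorable"], X).fit(disp=False)
    if best is None or fit.llf > best[1]:
        best = (p, fit.llf, fit)          # keep the power with the highest log-likelihood

print("best power:", best[0])
```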
Abstract:
Objective: To determine the occurrence of adverse drug reactions (ADRs) as a cause of admission to an intermediate care unit of a university hospital. Materials and Methods: The clinical records of patients admitted to the Emergency Room – Intermediate Care unit (Sala de Emergencias – Cuidado Intermedio, SALEM) between September and December 2012 who met the inclusion criteria were reviewed, and suspected cases of adverse drug reaction (ADR) were identified; these were then assessed by four investigators for causality with the Naranjo algorithm, for preventability with the Schumock and Thornton criteria, and for clinical classification with the DoTS system. Results: 96 patients presented 108 cases of ADR. The most frequent ADRs were arrhythmias and upper gastrointestinal bleeding (12.04%), 20.3% of cases corresponded to therapeutic failures, and the drugs most frequently involved were acetylsalicylic acid (15.74%) and losartan (10.19%). 46 cases were classified as possible and only one as definite. Using the DoTS classification, in 82.4% of cases the dose was collateral (within the therapeutic dose range), 89.8% were time-independent, and the factors most frequently associated with susceptibility to ADRs were comorbidities (41.7%) and age (49%). 44% of the ADRs were preventable. Conclusion: ADRs are a non-negligible cause of admission to an intermediate care unit, different assessment systems exist to evaluate them, and a significant proportion of them are preventable. Further national-level studies are needed to assess their incidence and to establish classification standards and measures to mitigate their effects.
Obesity and diabetes, the built environment, and the ‘local’ food economy in the United States, 2007
Abstract:
Obesity and diabetes are increasingly attributed to environmental factors; however, little attention has been paid to the influence of the ‘local’ food economy. This paper examines the association of measures relating to the built environment and ‘local’ agriculture with U.S. county-level prevalence of obesity and diabetes. Key indicators of the ‘local’ food economy include the density of farmers’ markets and the presence of farms with direct sales. This paper employs a robust regression estimator to account for non-normality of the data and to accommodate outliers. Overall, the built environment is associated with the prevalence of obesity and diabetes, and a strong ‘local’ food economy may play an important role in prevention. Results imply considerable scope for community-level interventions.
Abstract:
This study investigates the determinants of commercial and retail airport revenues as well as revenues from real estate operations. Cross-sectional OLS, 2SLS and robust regression models of European airports identify a number of significant drivers of airport revenues. Aviation revenues per passenger are mainly determined by the per-capita national income of the country in which the airport is located, the percentage of leisure travelers and the size of the airport, proxied by total aviation revenues. Main drivers of commercial revenues per passenger include the total number of passengers passing through the airport, the ratio of commercial to total revenues, the national income, the share of domestic and leisure travelers and the total number of flights. These results are in line with previous findings of a negative influence of business travelers on commercial revenues per passenger. We also find that a high amount of retail space per passenger is generally associated with lower commercial revenues per square meter, confirming decreasing marginal revenue effects. Real estate revenues per passenger are positively associated with per-capita national income at the airport location, the share of intra-EU passengers and the percentage of delayed flights. Overall, aviation and non-aviation revenues appear to be strongly interlinked, underlining the potential for a comprehensive airport management strategy above and beyond mere cost minimization of the aviation sector.
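The 2SLS approach mentioned above can be sketched manually as two OLS stages. The variable names, the choice of endogenous regressor and the instrument below are hypothetical, and the second-stage standard errors from this naive approach are not the correct 2SLS errors; a dedicated IV routine would be used in practice.

```python
# Generic two-stage least squares (2SLS) sketch with statsmodels. Variable names
# (commercial_rev, retail_space, gdp_pc, share_leisure, instrument) are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("airports.csv")                  # hypothetical cross-section of airports

exog = sm.add_constant(df[["gdp_pc", "share_leisure"]])

# Stage 1: regress the endogenous regressor on the instrument plus exogenous controls.
stage1 = sm.OLS(df["retail_space"], pd.concat([exog, df[["instrument"]]], axis=1)).fit()
df["retail_space_hat"] = stage1.fittedvalues

# Stage 2: replace the endogenous regressor by its first-stage fitted values.
stage2 = sm.OLS(df["commercial_rev"],
                pd.concat([exog, df[["retail_space_hat"]]], axis=1)).fit()
print(stage2.params)   # note: these SEs are not the proper 2SLS standard errors
```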
Abstract:
Drawing upon an updated and expanded dataset of Energy Star and LEED labeled commercial offices, this paper investigates the effect of eco-labeling on rental rates, sale prices and occupancy rates. Using OLS and robust regression procedures, hedonic modeling is used to test whether the presence of an eco-label has a significant positive effect on rental rates, sale prices and occupancy rates. The study suggests that estimated coefficients can be sensitive to outlier treatment. For sale prices and occupancy rates, there are notable differences between estimated coefficients for OLS and robust regressions. The results suggest that both Energy Star and LEED offices obtain rental premiums of approximately 3%. A 17% sale price premium is estimated for Energy Star labeled offices but no significant sale price premium is estimated for LEED labeled offices. Surprisingly, no significant occupancy premium is estimated for Energy Star labeled offices and a negative occupancy premium is estimated for LEED labeled offices.
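The outlier-sensitivity check described above amounts to comparing OLS and robust estimates of the same hedonic equation. A minimal sketch, assuming a log-rent specification with a hypothetical eco_label dummy and illustrative controls:

```python
# Sketch: comparing OLS and robust (Huber) estimates of an eco-label premium in a
# hedonic log-rent equation. Column names and the specification are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("offices.csv")                        # hypothetical office sample
formula = "np.log(rent) ~ eco_label + size + age + C(submarket)"

ols = smf.ols(formula, data=df).fit()
rob = smf.rlm(formula, data=df, M=sm.robust.norms.HuberT()).fit()

# An eco-label coefficient b in a log-rent model implies a premium of exp(b) - 1.
compare = pd.DataFrame({"OLS": ols.params, "Huber": rob.params})
print(compare.loc["eco_label"])
```

A large gap between the two columns for a given coefficient is the signal, noted in the abstract, that outlier treatment is driving the estimate.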
Abstract:
Objective: Asthma is the most common chronic disease in childhood and has been designated a public health problem due to the increase in its prevalence in recent decades, the amount of health service expenditure it absorbs and an absence of consensus about its etiology. The relationships among psychosocial factors and the occurrence, symptomatology, and severity of asthma have recently been considered. There is still controversy about the association between asthma and a child's mental health, since the pathways through which this relationship is established are complex and not well researched. This study aims to investigate whether behavior problems are associated with the prevalence of asthma symptoms in a large urban center in Latin America. Methods: This is a cross-sectional study of 869 children between 6 and 12 years old, residents of Salvador, Brazil. The International Study of Asthma and Allergies in Childhood (ISAAC) instrument was used to evaluate the prevalence of asthma symptoms. The Child Behavior Checklist (CBCL) was employed to evaluate behavioral problems. Results: 19.26% (n = 212) of the children presented symptoms of asthma. 35% were classified as having clinical behavioral problems. A robust Poisson regression model demonstrated a statistically significant association between the presence of behavioral problems and the occurrence of asthma symptoms (PR: 1.43; 95% CI: 1.10-1.85). Conclusion: These results suggest an association between behavioral problems and pediatric asthma, and support the inclusion of mental health care in the provision of services for asthma morbidity.
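A prevalence ratio of the kind reported above is commonly obtained from a Poisson model with robust (sandwich) standard errors. A minimal sketch with hypothetical column names and covariates:

```python
# Sketch: Poisson regression with robust (sandwich) standard errors to estimate a
# prevalence ratio (PR) for asthma symptoms given clinical behavioral problems.
# Column names and covariates are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("children.csv")   # hypothetical: asthma (0/1), behavior_problem (0/1), age, sex

fit = smf.glm(
    "asthma ~ behavior_problem + age + C(sex)",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")              # sandwich variance makes the Poisson model valid for a binary outcome

pr = np.exp(fit.params["behavior_problem"])
ci = np.exp(fit.conf_int().loc["behavior_problem"])
print(f"PR = {pr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```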
Abstract:
Financial markets can be viewed as a highly complex evolving system that is very sensitive to economic instabilities. The complex organization of the market can be represented in a suitable fashion in terms of complex networks, which can be constructed from stock prices such that each pair of stocks is connected by a weighted edge that encodes the distance between them. In this work, we propose an approach to analyze the topological and dynamic evolution of financial networks based on the stock correlation matrices. An entropy-related measurement is adopted to quantify the robustness of the evolving financial market organization. It is verified that the network topological organization undergoes strong variation during financial instabilities and the networks in such periods become less robust. A statistical robust regression model is proposed to quantify the relationship between the network structure and resilience. The coefficients of this model indicate that the average shortest path length is the measurement most strongly related to network resilience. This result indicates that a collective behavior is observed between stocks during financial crises. More specifically, stocks tend to synchronize their price evolution, leading to a high correlation between pairs of stock prices, which contributes to the increase in the distance between them and, consequently, decreases the network resilience. [doi:10.1063/1.3683467]
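A sketch of the network construction step described above: stock returns are turned into a correlation-based distance, a graph is built from the strongest links, and the average shortest path length is computed. The distance formula, threshold and file name are illustrative assumptions; the paper's entropy-related robustness measure and robust regression step are not reproduced here.

```python
# Sketch: build a stock network from a return correlation matrix using the common
# distance d_ij = sqrt(2 * (1 - rho_ij)), keep the strongest links, and compute the
# average shortest path length. Threshold and file are illustrative assumptions.
import numpy as np
import pandas as pd
import networkx as nx

returns = pd.read_csv("daily_returns.csv", index_col=0)   # hypothetical: one column per stock
rho = returns.corr()
dist = np.sqrt(2.0 * (1.0 - rho))

G = nx.Graph()
stocks = list(rho.columns)
for i, a in enumerate(stocks):
    for b in stocks[i + 1:]:
        if dist.loc[a, b] < 1.0:                  # keep only sufficiently correlated pairs
            G.add_edge(a, b, weight=dist.loc[a, b])

if G.number_of_nodes() > 0 and nx.is_connected(G):
    print("average shortest path:", nx.average_shortest_path_length(G, weight="weight"))
```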
Abstract:
BACKGROUND: Little is known about the population's exposure to radio frequency electromagnetic fields (RF-EMF) in industrialized countries. OBJECTIVES: To examine levels of exposure and the importance of different RF-EMF sources and settings in a sample of volunteers living in a Swiss city. METHODS: RF-EMF exposure of 166 volunteers from Basel, Switzerland, was measured with personal exposure meters (exposimeters). Participants carried an exposimeter for 1 week (two separate weeks in 32 participants) and completed an activity diary. Mean values were calculated using the robust regression on order statistics (ROS) method. RESULTS: Mean weekly exposure to all RF-EMF sources was 0.13 mW/m² (0.22 V/m) (range of individual means 0.014-0.881 mW/m²). Exposure was mainly due to mobile phone base stations (32.0%), mobile phone handsets (29.1%) and digital enhanced cordless telecommunications (DECT) phones (22.7%). Persons owning a DECT phone (total mean 0.15 mW/m²) or mobile phone (0.14 mW/m²) were exposed more than those not owning a DECT or mobile phone (0.10 mW/m²). Mean values were highest in trains (1.16 mW/m²), airports (0.74 mW/m²) and tramways or buses (0.36 mW/m²), and higher during daytime (0.16 mW/m²) than nighttime (0.08 mW/m²). The Spearman correlation coefficient between mean exposure in the first and second week was 0.61. CONCLUSIONS: Exposure to RF-EMF varied considerably between persons and locations but was fairly consistent within persons. Mobile phone handsets, mobile phone base stations and cordless phones were important sources of exposure in urban Switzerland.
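The robust regression on order statistics (ROS) method used above for the mean values handles measurements below the detection limit. A simplified sketch for a single detection limit, with hypothetical numbers: detected readings anchor a lognormal fit on normal quantiles of plotting positions, censored readings are imputed from that fit, and the mean is taken over the combined sample. This is not the authors' exact implementation.

```python
# Simplified sketch of "robust regression on order statistics" (ROS) for a sample with
# a single detection limit. Readings and counts are hypothetical.
import numpy as np
from scipy import stats

values = np.array([0.02, 0.05, 0.09, 0.15, 0.30, 0.55, 0.88])   # detected readings (mW/m^2)
n_censored = 5                                                   # readings below the detection limit
n = len(values) + n_censored

# Plotting positions for the full ordered sample; detected values occupy the upper ranks.
ranks = np.arange(1, n + 1)
pp = (ranks - 0.375) / (n + 0.25)            # Blom-type plotting positions
z = stats.norm.ppf(pp)

# Regress log of the detected values on the normal quantiles of their plotting positions.
slope, intercept, r, p, se = stats.linregress(z[n_censored:], np.log(np.sort(values)))

# Impute the censored observations from the fitted lognormal line, then combine.
imputed = np.exp(intercept + slope * z[:n_censored])
ros_mean = np.mean(np.concatenate([imputed, values]))
print(f"ROS mean = {ros_mean:.3f} mW/m^2")
```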
Abstract:
Spelling is an important literacy skill, and learning to spell is an important component of learning to write. Learners with strong spelling skills also exhibit greater reading, vocabulary, and orthographic knowledge than those with poor spelling skills (Ehri & Rosenthal, 2007; Ehri & Wilce, 1987; Rankin, Bruning, Timme, & Katkanant, 1993). English, being a deep orthography, has inconsistent sound-to-letter correspondences (Seymour, 2005; Ziegler & Goswami, 2005). This poses a great challenge for learners in gaining spelling fluency and accuracy. The purpose of the present study is to examine cross-linguistic transfer of English vowel spellings in Spanish-speaking adult ESL learners. The research participants were 129 Spanish-speaking adult ESL learners and 104 native English-speaking GED students enrolled in a community college located in the South Atlantic region of the United States. The adult ESL participants were in classes at three different levels of English proficiency: advanced, intermediate, and beginning. An experimental English spelling test was administered to both the native English-speaking and ESL participants. In addition, the adult ESL participants took standardized spelling tests to rank their spelling skills in both English and Spanish. The data were analyzed using robust regression and Poisson regression procedures, the Mann-Whitney test, and descriptive statistics. The study found that both Spanish spelling skills and English proficiency are strong predictors of English spelling skills. Spanish spelling is also a strong predictor of the level of L1-influenced transfer. More proficient Spanish spellers made significantly fewer L1-influenced spelling errors than less proficient Spanish spellers. L1-influenced transfer of spelling knowledge from Spanish to English likely occurred in three vowel targets (/ɑɪ/ spelled as ae, ai, or ay, /ɑʊ/ spelled as au, and /eɪ/ spelled as e). The ESL participants and the native English-speaking participants produced highly similar error patterns of English vowel spellings when the errors did not indicate L1-influenced transfer, which implies that the two groups might follow similar trajectories of developing English spelling skills. The findings may help guide future researchers or practitioners to modify and develop instructional spelling interventions to meet the needs of adult ESL learners and help them gain English spelling competence.
Abstract:
Spatial data analysis, mapping and visualization are of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on some empirical data (measurements). A number of state-of-the-art methods can be used for the task: deterministic interpolation, geostatistical methods such as the family of kriging estimators (Deutsch and Journel, 1997), machine learning algorithms such as artificial neural networks (ANN) of different architectures, hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996), etc. Empirical environmental data are always contaminated by noise, often of unknown nature. That is one of the reasons why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as the realization of some spatial random process. To obtain an estimate with kriging, one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated when the number of measurements is insufficient, and the variogram is sensitive to outliers and extremes. ANNs are a powerful tool, but they also suffer from a number of drawbacks. ANNs of a special type, multilayer perceptrons, are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear, robust to noise in the measurements, able to deal with small empirical datasets, and backed by a solid mathematical framework is of great importance. The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression. SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification, Support Vector Machines, have shown good results on different machine learning tasks. The results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of SVMs for regression, Support Vector Regression (SVR), are less studied. First results of the application of SVR for spatial mapping of physical quantities were obtained by the authors for mapping of medium porosity (Kanevski et al., 1999) and for mapping of radioactively contaminated territories (Kanevski and Canu, 2000). The present paper is devoted to further understanding of the properties of the SVR model for spatial data analysis and mapping. A detailed description of SVR theory can be found in Cristianini and Shawe-Taylor (2000) and Smola (1996), and the basic equations for nonlinear modeling are given in Section 2. Section 3 discusses the application of SVR to spatial data mapping in a real case study: soil pollution by the Cs-137 radionuclide. Section 4 discusses the properties of the model applied to noisy data and data with outliers.
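A minimal sketch of epsilon-SVR with an RBF kernel for spatial mapping of point measurements, in the spirit of the case study described above. The file name, column names and hyperparameter values are assumptions; in practice C, epsilon and gamma would be tuned, for example by cross-validation, to balance robustness to noise against overfitting.

```python
# Minimal sketch of epsilon-SVR (RBF kernel) for spatial mapping: train on measured
# points (x, y, value) and predict on a regular grid. File, columns and
# hyperparameters are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

data = pd.read_csv("cs137_measurements.csv")       # hypothetical: columns x, y, activity
X, y = data[["x", "y"]].values, data["activity"].values

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma="scale"))
model.fit(X, y)

# Predict on a regular grid covering the study area.
gx, gy = np.meshgrid(np.linspace(X[:, 0].min(), X[:, 0].max(), 100),
                     np.linspace(X[:, 1].min(), X[:, 1].max(), 100))
grid = np.column_stack([gx.ravel(), gy.ravel()])
surface = model.predict(grid).reshape(gx.shape)    # mapped estimate of the pollution field
```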