942 results for Regression To The Mean
Abstract:
Crash reduction factors (CRFs) are used to estimate the number of traffic crashes expected to be prevented by investment in safety improvement projects. The method used to develop CRFs in Florida has been based on the commonly used before-and-after approach, which suffers from a widely recognized problem known as regression-to-the-mean (RTM). The Empirical Bayes (EB) method has been introduced as a means of addressing the RTM problem. This method requires information from both the treatment and reference sites in order to predict the number of crashes that would have been expected had the safety improvement projects at the treatment sites not been implemented. The information from the reference sites is estimated from a safety performance function (SPF), a mathematical relationship that links crashes to traffic exposure. The objective of this dissertation was to develop SPFs for different functional classes of the Florida State Highway System. Crash data from years 2001 through 2003, along with traffic and geometric data, were used in the SPF model development. SPFs for both rural and urban roadway categories were developed. The modeling data were based on one-mile segments with homogeneous traffic and geometric conditions within each segment; segments involving intersections were excluded. Scatter plots of the data show that the relationships between crashes and traffic exposure are nonlinear, with crashes increasing with traffic exposure at an increasing rate. Four regression models, namely Poisson (PRM), Negative Binomial (NBRM), zero-inflated Poisson (ZIP), and zero-inflated Negative Binomial (ZINB), were fitted to the one-mile segment records for individual roadway categories. The best model was selected for each category based on a combination of the Likelihood Ratio test, the Vuong statistical test, and Akaike's Information Criterion (AIC). The NBRM was found to be appropriate for only one category, and the ZINB model was found to be more appropriate for six other categories. The overall results show that the Negative Binomial distribution model generally provides a better fit to the data than the Poisson distribution model. In addition, the ZINB model was found to give the best fit when the count data exhibit excess zeros and over-dispersion, which was the case for most of the roadway categories. While model validation shows that most data points fall within the 95% prediction intervals of the models developed, the Pearson goodness-of-fit measure does not show statistical significance. This is expected, as traffic volume is only one of the many factors contributing to the overall crash experience, and the SPFs are to be applied in conjunction with Accident Modification Factors (AMFs) to further account for the safety impacts of major geometric features before arriving at the final crash prediction. However, with improved traffic and crash data quality, the predictive power of SPF models may be further improved.
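As a rough illustration of the model-comparison step described in this abstract, the sketch below (not the dissertation's code; the file and column names segments.csv, log_aadt, and crashes are hypothetical) fits Poisson and Negative Binomial crash-frequency models with statsmodels and compares them via a likelihood-ratio test and AIC:

```python
# Minimal sketch: compare Poisson vs. Negative Binomial fits for crash counts
# on one-mile segments. Assumes a table with a crash count and log(AADT)
# as the traffic-exposure covariate (hypothetical names).
import pandas as pd
import statsmodels.api as sm
from scipy import stats

df = pd.read_csv("segments.csv")           # hypothetical segment-level data
X = sm.add_constant(df[["log_aadt"]])      # exposure as the sole predictor
y = df["crashes"]

poisson = sm.Poisson(y, X).fit(disp=False)
negbin = sm.NegativeBinomial(y, X).fit(disp=False)

# LR test of NB against Poisson: tests for over-dispersion (alpha > 0).
# The p-value is halved because alpha sits on the parameter boundary.
lr = 2 * (negbin.llf - poisson.llf)
p = stats.chi2.sf(lr, df=1) / 2
print(f"LR = {lr:.2f}, p = {p:.4g}")
print(f"AIC  Poisson: {poisson.aic:.1f}  NB: {negbin.aic:.1f}")
```

The zero-inflated variants (ZIP, ZINB) can be fitted analogously with statsmodels' ZeroInflatedPoisson and ZeroInflatedNegativeBinomialP classes and compared with the Vuong test mentioned in the abstract.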
Abstract:
IEECAS SKLLQG
Abstract:
Introduction. Surgical treatment of scoliosis is assessed in the spine clinic by the surgeon making numerous measurements on X-rays as well as of the rib hump. However, it is important to understand which of these measures correlate with self-reported improvements in patients' quality of life following surgery. The objective of this study was to examine the relationship between patient satisfaction after thoracoscopic (keyhole) anterior scoliosis surgery and standard deformity correction measures, using the Scoliosis Research Society (SRS) adolescent questionnaire. Methods. A series of 100 consecutive adolescent idiopathic scoliosis patients received a single anterior rod via a keyhole approach at the Mater Children's Hospital, Brisbane. Patients completed SRS outcomes questionnaires before surgery and again at 24 months after surgery. Multiple regression and t-tests were used to investigate the relationship between SRS scores and the deformity correction achieved after surgery. Results. There were 94 females and 6 males with a mean age of 16.1 years. The mean Cobb angle improved from 52° pre-operatively to 21° for the instrumented levels post-operatively (59% correction), and the mean rib hump improved from 16° to 8° (51% correction). The mean total SRS score for the cohort was 99.4/120, which indicated a high level of satisfaction with the results of their scoliosis surgery. None of the deformity-related parameters in the multiple regressions were significant. However, the twenty patients with the smallest Cobb angles after surgery reported significantly higher SRS scores than the twenty patients with the largest Cobb angles after surgery, while there was no difference on the basis of rib hump correction. Discussion. Patients undergoing thoracoscopic (keyhole) anterior scoliosis correction report good SRS scores, comparable to those in previous studies. We suggest that the absence of any statistically significant difference in SRS scores between patients with and without rod or screw complications is because these complications are not associated with any clinically significant loss of correction in our patient group. The Cobb angle after surgery was the only significant predictor of patient satisfaction when comparing subgroups of patients with the largest and smallest Cobb angles after surgery.
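As an illustration of the subgroup comparison reported above, a minimal sketch (synthetic data, not the study's records) of the t-test between the twenty smallest and twenty largest post-operative Cobb angles might look like this:

```python
# Minimal sketch with synthetic data: compare SRS scores of the 20 patients
# with the smallest post-op Cobb angles against the 20 with the largest.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
cobb = rng.normal(21, 6, size=100)               # synthetic post-op Cobb angles
srs = 100 - 0.3 * cobb + rng.normal(0, 4, 100)   # synthetic SRS total scores

order = np.argsort(cobb)
low_cobb, high_cobb = srs[order[:20]], srs[order[-20:]]
t, p = stats.ttest_ind(low_cobb, high_cobb)
print(f"t = {t:.2f}, p = {p:.4f}")
```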
Abstract:
Recently, mean-variance analysis has been proposed as a novel paradigm for modelling document ranking in Information Retrieval. The main merit of this approach is that it diversifies the ranking of retrieved documents. In its original formulation, the strategy considers both the mean of the relevance estimates of retrieved documents and their variance. However, when this strategy has been empirically instantiated, the concepts of mean and variance are discarded in favour of a point-wise estimation of relevance (to replace the mean) and of a parameter to be tuned or, alternatively, a quantity dependent upon the document length (to replace the variance). In this paper we revisit this ranking strategy by going back to its roots: mean and variance. For each retrieved document, we infer a relevance distribution from a series of point-wise relevance estimates provided by a number of different systems. This is used to compute the mean and the variance of the document's relevance estimates. On the TREC ClueWeb collection, we show that this approach improves retrieval performance. This development could lead to new strategies for addressing the fusion of relevance estimates provided by different systems.
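A minimal sketch of this mean-variance re-ranking idea (illustrative only; it assumes the per-document scores from the different systems are already normalized and comparable, and the risk parameter b is a free knob):

```python
# Rank documents by the mean of their relevance estimates minus a risk term
# proportional to the variance of those estimates across systems.
import numpy as np

def mean_variance_rank(scores, b=1.0):
    """scores: dict mapping doc_id -> list of relevance estimates,
    one per retrieval system; b: risk-aversion parameter."""
    ranked = []
    for doc_id, estimates in scores.items():
        est = np.asarray(estimates, dtype=float)
        ranked.append((doc_id, est.mean() - b * est.var(ddof=1)))
    return sorted(ranked, key=lambda item: item[1], reverse=True)

# Example: three systems scoring two documents; d2 is promising but uncertain.
scores = {"d1": [0.9, 0.7, 0.8], "d2": [0.95, 0.3, 0.6]}
print(mean_variance_rank(scores, b=0.5))
```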
Abstract:
Background. Supine imaging modalities provide valuable 3D information on scoliotic anatomy, but the altered spine geometry between the supine and standing positions affects the Cobb angle measurement. Previous studies report a mean 7°-10° Cobb angle increase from supine to standing, but none have reported the effect of endplate pre-selection or whether other parameters affect this Cobb angle difference. Methods. Cobb angles from existing coronal radiographs were compared to those on existing low-dose CT scans taken within three months of the reference radiograph for a group of females with adolescent idiopathic scoliosis. Reformatted coronal CT images were used to measure supine Cobb angles with and without endplate pre-selection (endplates selected from the radiographs) by two observers on three separate occasions. Inter- and intra-observer measurement variability were assessed. Multiple linear regression was used to investigate whether there was a relationship between the supine-to-standing Cobb angle change and eight variables: patient age, mass, standing Cobb angle, Risser sign, ligament laxity, Lenke type, fulcrum flexibility, and time delay between radiograph and CT scan. Results. Fifty-two patients with right thoracic Lenke Type 1 curves and mean age 14.6 years (SD 1.8) were included. The mean Cobb angle on standing radiographs was 51.9° (SD 6.7). The mean Cobb angle on supine CT images without pre-selection of endplates was 41.1° (SD 6.4); with endplate pre-selection it was 40.5° (SD 6.6). Pre-selecting vertebral endplates increased the mean Cobb change by 0.6° (SD 2.3, range −9° to 6°). When free to do so, observers chose different levels for the end vertebrae in 39% of cases. The regression revealed a statistically significant relationship between the supine-to-standing Cobb change and fulcrum flexibility (p = 0.001), age (p = 0.027), and standing Cobb angle (p < 0.001). The 95% confidence intervals for intra-observer and inter-observer measurement variability were 3.1° and 3.6°, respectively. Conclusions. Pre-selecting vertebral endplates causes only minor changes to the mean supine-to-standing Cobb change. There is a statistically significant relationship between the supine-to-standing Cobb change and fulcrum flexibility, such that this difference can be considered a potential alternative measure of spinal flexibility.
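A minimal sketch (hypothetical file and column names, not the study's code) of the regression described above, using statsmodels:

```python
# Regress the supine-to-standing Cobb change on the eight candidate
# predictors with ordinary least squares.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cobb_measurements.csv")   # hypothetical per-patient table
model = smf.ols(
    "cobb_change ~ age + mass + standing_cobb + risser_sign"
    " + ligament_laxity + lenke_type + fulcrum_flexibility + scan_delay",
    data=df,
).fit()
print(model.summary())   # p-values identify the significant predictors
```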
Abstract:
The critical behavior of osmotic susceptibility in an aqueous electrolyte mixture 1-propanol (1P) + water (W) + potassium chloride is reported. This mixture exhibits re-entrant phase transitions and has a nearly parabolic critical line with its apex representing a double critical point (DCP). The behavior of the susceptibility exponent is deduced from static light-scattering measurements on approaching the lower critical solution temperatures (T_L's) along different experimental paths (by varying t) in the one-phase region. The light-scattering data analysis substantiates the existence of a nonmonotonic crossover behavior of the susceptibility exponent in this mixture. For a T_L far away from the DCP, the effective susceptibility exponent γ_eff as a function of t displays a nonmonotonic crossover from its single-limit three-dimensional (3D) Ising value (∼1.24) toward its mean-field value with increasing t. For the T_L closest to the DCP, γ_eff displays a sharp, nonmonotonic crossover from its nearly doubled 3D-Ising value toward its nearly doubled mean-field value with increasing t. The renormalized Ising regime extends over a relatively larger t range for the T_L closest to the DCP, and a trend toward shrinkage of the renormalized Ising regime is observed as T_L shifts away from the DCP. Nevertheless, the crossover to the mean-field limit extends well beyond t > 10^−2 for the T_L's studied. The observed crossover behavior is attributed to the presence of strong ion-induced clustering in this mixture, as revealed by various structure-probing techniques. As far as the critical behavior in complex or associating mixtures with special critical points (like the DCP) is concerned, our results indicate that the influence of the DCP on the critical behavior must be taken into account not only in the renormalization of the critical exponent but also in the range of the Ising regime, which can shrink as the influence of the DCP decreases and with the extent of structuring in the system. The utility of the field variable t_UL in analyzing re-entrant phase transitions is demonstrated. The effective susceptibility exponent as a function of t_UL displays a nonmonotonic crossover from its asymptotic 3D-Ising value toward a value slightly lower than its nonasymptotic mean-field value of 1. This behavior in the nonasymptotic, high-t_UL region is interpreted in terms of the possibility of a nonmonotonic crossover to the mean-field value from below, as foreseen earlier in micellar systems.
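For orientation, the effective exponent discussed here is conventionally defined as the local logarithmic derivative of the susceptibility with respect to the reduced temperature (a standard definition, not specific to this paper):

```latex
\gamma_{\mathrm{eff}}(t) = -\frac{\mathrm{d}\ln \chi(t)}{\mathrm{d}\ln t},
\qquad t = \left|\frac{T - T_L}{T_L}\right|
```

so that γ_eff approaches the 3D-Ising value (∼1.24) as t → 0 and the mean-field value of 1 in the nonasymptotic regime.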
Abstract:
We consider the effect of subdividing the potential barrier along the reaction coordinate on the Kramers escape rate for a model potential. Using the known supersymmetric potential approach, we show the existence of an optimal number of subdivisions that maximizes the rate. We cast the problem as a mean first passage time problem of a biased random walker and obtain equivalent results. We briefly summarize the results of our investigation on the increase in the escape rate by placing a blow-torch in the unstable part of one of the potential wells. (C) 1999 Elsevier Science B.V. All rights reserved.
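The mean-first-passage-time formulation invoked here rests on the standard one-dimensional result for an overdamped particle diffusing in a potential U(x) (textbook material, stated for orientation rather than taken from the paper):

```latex
T(x_0 \to b) = \frac{1}{D}\int_{x_0}^{b} e^{U(y)/k_B T}
\left( \int_{a}^{y} e^{-U(z)/k_B T} \, \mathrm{d}z \right) \mathrm{d}y
```

with a reflecting boundary at a and an absorbing boundary at b; the escape rate is the inverse of this time, so reshaping the barrier by subdivision changes U and hence the rate.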
Abstract:
The potential of Raman spectroscopy for the determination of meat quality attributes has been investigated using data from a set of 52 cooked beef samples, which were rated by trained taste panels. The Raman spectra, shear force and cooking loss were measured, and PLS was used to correlate the attributes with the Raman data. Good correlations and standard errors of prediction were found when the Raman data were used to predict the panels' rating of acceptability of texture (R² = 0.71, Root Mean Square Error of Prediction (RMSEP) as % of the mean (μ) = 15%), degree of tenderness (R² = 0.65, RMSEP% of μ = 18%), degree of juiciness (R² = 0.62, RMSEP% of μ = 16%), and overall acceptability (R² = 0.67, RMSEP% of μ = 11%). In contrast, the mechanically determined shear force was poorly correlated with tenderness (R² = 0.15). Tentative interpretation of the plots of the regression coefficients suggests that the alpha-helix to beta-sheet ratio of the proteins and the hydrophobicity of the myofibrillar environment are important factors contributing to the shear force, tenderness, texture and overall acceptability of the beef. In summary, this work demonstrates that Raman spectroscopy can be used to predict consumer-perceived beef quality. In part, this overall success is due to the fact that the Raman method predicts texture and tenderness, which are the predominant factors in determining overall acceptability in the Western world. Nonetheless, it is clear that Raman spectroscopy has considerable potential as a method for non-destructive and rapid determination of beef quality parameters.
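A minimal sketch of the chemometric step described above (hypothetical arrays; not the paper's code): partial least squares regression of taste-panel scores on Raman spectra, evaluated by R² and RMSEP on a held-out set.

```python
# PLS regression of a sensory attribute on Raman spectra, with RMSEP
# expressed as a percentage of the mean, as in the abstract.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

X = np.load("raman_spectra.npy")   # (n_samples, n_wavenumbers), hypothetical
y = np.load("tenderness.npy")      # taste-panel ratings, hypothetical

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
y_pred = pls.predict(X_te).ravel()

rmsep = np.sqrt(mean_squared_error(y_te, y_pred))
print(f"R2 = {r2_score(y_te, y_pred):.2f}, "
      f"RMSEP% of mean = {100 * rmsep / y_te.mean():.0f}%")
```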
Abstract:
The goal of this thesis is to extend bootstrap theory to panel-data models. Panel data are obtained by observing several statistical units over several time periods. Their double dimension, individual and temporal, makes it possible to control for unobservable heterogeneity across individuals and across time periods, and therefore to conduct richer studies than with time series or cross-sectional data. The advantage of the bootstrap is that it yields more precise inference than classical asymptotic theory, or inference that would otherwise be impossible in the presence of nuisance parameters. The method consists of drawing random samples that resemble the analysis sample as closely as possible. The statistical object of interest is estimated on each of these random samples, and the set of estimated values is used for inference. The literature contains some applications of the bootstrap to panel data, but without rigorous theoretical justification or only under strong assumptions. This thesis proposes a bootstrap method better suited to panel data; its three chapters analyze its validity and application. The first chapter posits a simple model with a single parameter and addresses the theoretical properties of the estimator of the mean. We show that the double resampling we propose, which accounts for both the individual dimension and the time dimension, is valid in these models. Resampling only in the individual dimension is not valid in the presence of temporal heterogeneity, and resampling only in the time dimension is not valid in the presence of individual heterogeneity. The second chapter extends the first to the linear panel regression model. Three types of regressors are considered: individual characteristics, time characteristics, and regressors that vary across both time and individuals. Using a two-way error-components model, the ordinary least squares estimator, and the residual bootstrap, we show that resampling in the individual dimension alone is valid for inference on the coefficients associated with regressors that vary only across individuals. Resampling in the time dimension alone is valid only for the sub-vector of parameters associated with regressors that vary only over time. Double resampling, for its part, is valid for inference on the full parameter vector. The third chapter re-examines the difference-in-differences exercise of Bertrand, Duflo and Mullainathan (2004). This estimator is commonly used in the literature to evaluate the impact of public policies. The empirical exercise uses panel data from the Current Population Survey on women's wages in the 50 states of the United States of America from 1979 to 1999. Pseudo-intervention variables are generated at the state level, and the tests are expected to conclude that these placebo policies have no effect on women's wages. Bertrand, Duflo and Mullainathan (2004) show that failing to account for heterogeneity and temporal dependence causes severe size distortions of the tests when evaluating the impact of public policies using panel data. One of the recommended solutions is to use the bootstrap. The double-resampling method developed in this thesis corrects the test-size problem and therefore allows the impact of public policies to be assessed correctly.
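A minimal sketch of the double-resampling idea for the mean of a balanced panel (illustrative only; this is not the thesis's code): draw individuals and time periods with replacement independently, then form the bootstrap panel from the crossed draws.

```python
# Double (two-way) bootstrap for a balanced panel Y[i, t]: resample both
# the individual and the time dimension, then recompute the statistic.
import numpy as np

rng = np.random.default_rng(0)

def double_resample_mean(Y, n_boot=999):
    """Y: (N individuals x T periods) array; returns bootstrap draws
    of the overall mean under independent two-way resampling."""
    N, T = Y.shape
    draws = np.empty(n_boot)
    for b in range(n_boot):
        ids = rng.integers(0, N, size=N)        # resample individuals
        periods = rng.integers(0, T, size=T)    # resample time periods
        draws[b] = Y[np.ix_(ids, periods)].mean()
    return draws

# Example: panel with individual effects, time effects and noise.
N, T = 50, 20
Y = (rng.normal(size=(N, 1)) + rng.normal(size=(1, T))
     + rng.normal(size=(N, T)))
print("95% bootstrap CI:", np.percentile(double_resample_mean(Y), [2.5, 97.5]))
```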
Abstract:
Abstract taken from the publication
Abstract:
The present essay's central argument, or hypothesis, is that the mechanisms accelerating a wealth-concentrating and exclusionary economy centred on the benefit and overprotection of big business, with a corresponding plundering of resources vital for life, generated forms of loss and regression in the right to healthcare and the dismantling of institutional protections. These are all expressed in indicators from 1990-2005, which point not only to the deterioration of healthcare programs and services but also to the undermining of the general conditions of life (social reproduction) and, in contrast to the reports and predictions of the era's governments, to a stagnation or deterioration in health indicators, especially those most sensitive to the crisis. The present study's argument unfolds across distinct chapters. First, we clarify the categories central to understanding a complex issue: the concept of health itself and its determinants, emphasizing that an integral understanding is a fundamental prerequisite to unravelling what documents and reports from this era either leave unsaid or distort. Based on that analysis, we explain the harmful effects of global economic acceleration and of the monopolization and pillaging of strategic healthcare goods: not only those which directly obstruct access to health services, but also those, like the breakdown of small economies, linked to impoverishment and the worsening of ways of living. Thinking epidemiologically, we intend to show signs of the deterioration of broad collectivities' ways of life as a result of the mechanisms of acceleration and pillage. We then collect disparate evidence of the deterioration of human health and ecosystems in order, finally, to establish the most urgent conclusions about this unfortunate period of our social and medical history.