901 results for Techniques of data analysis
Abstract:
Background Protein-energy malnutrition (PEM) is common in people with end-stage kidney disease (ESKD) undergoing maintenance haemodialysis (MHD) and correlates strongly with mortality. To date, there is no gold standard for detecting PEM in patients on MHD. Aim of Study The aim of this study was to evaluate whether Nutritional Risk Screening 2002 (NRS-2002), handgrip strength measurement, mid-upper arm muscle area (MUAMA), triceps skin fold measurement (TSF), serum albumin, normalised protein catabolic rate (nPCR), Kt/V and eKt/V, dry body weight, body mass index (BMI), age and time since start of MHD are relevant for assessing PEM in patients on MHD. Methods The predictive value of the selected parameters for mortality, and for mortality or weight loss of more than 5%, was assessed. Quantitative analysis of the 12 parameters in the same patients on MHD in autumn 2009 (n = 64) and spring 2011 (n = 40) was performed with paired statistical tests and multivariate logistic regression. Results Paired data analysis showed a significant reduction in dry body weight, BMI and nPCR. Kt/Vtot did not change; eKt/V and handgrip strength measurements were significantly higher in spring 2011. No changes were detected in TSF, serum albumin, NRS-2002 and MUAMA. Serum albumin was shown to be the only predictor of death and of the combined endpoint “death or weight loss of more than 5%”. Conclusion We now screen patients biannually for serum albumin, nPCR, Kt/V, handgrip strength of the shunt-free arm, dry body weight, age and time since initiation of MHD.
Abstract:
IMPORTANCE Some experts suggest that serum thyrotropin levels in the upper part of the current reference range should be considered abnormal, an approach that would reclassify many individuals as having mild hypothyroidism. Health hazards associated with such thyrotropin levels are poorly documented, but conflicting evidence suggests that thyrotropin levels in the upper part of the reference range may be associated with an increased risk of coronary heart disease (CHD). OBJECTIVE To assess the association between differences in thyroid function within the reference range and CHD risk. DESIGN, SETTING, AND PARTICIPANTS Individual participant data analysis of 14 cohorts with baseline examinations between July 1972 and April 2002 and with median follow-up ranging from 3.3 to 20.0 years. Participants included 55,412 individuals with serum thyrotropin levels of 0.45 to 4.49 mIU/L and no previously known thyroid or cardiovascular disease at baseline. EXPOSURES Thyroid function as expressed by serum thyrotropin levels at baseline. MAIN OUTCOMES AND MEASURES Hazard ratios (HRs) of CHD mortality and CHD events according to thyrotropin levels after adjustment for age, sex, and smoking status. RESULTS Among 55,412 individuals, 1813 people (3.3%) died of CHD during 643,183 person-years of follow-up. In 10 cohorts with information on both nonfatal and fatal CHD events, 4666 of 48,875 individuals (9.5%) experienced a first-time CHD event during 533,408 person-years of follow-up. For each 1-mIU/L higher thyrotropin level, the HR was 0.97 (95% CI, 0.90-1.04) for CHD mortality and 1.00 (95% CI, 0.97-1.03) for a first-time CHD event. Similarly, in analyses by categories of thyrotropin, the HRs of CHD mortality (0.94 [95% CI, 0.74-1.20]) and CHD events (0.97 [95% CI, 0.83-1.13]) were similar among participants with the highest (3.50-4.49 mIU/L) compared with the lowest (0.45-1.49 mIU/L) thyrotropin levels. Subgroup analyses by sex and age group yielded similar results. 
CONCLUSIONS AND RELEVANCE Thyrotropin levels within the reference range are not associated with risk of CHD events or CHD mortality. This finding suggests that differences in thyroid function within the population reference range do not influence the risk of CHD. Increased CHD risk does not appear to be a reason for lowering the upper thyrotropin reference limit.
Abstract:
OBJECTIVE The objective was to determine the risk of stroke associated with subclinical hypothyroidism. DATA SOURCES AND STUDY SELECTION Published prospective cohort studies were identified through a systematic search through November 2013 without restrictions in several databases. Unpublished studies were identified through the Thyroid Studies Collaboration. We collected individual participant data on thyroid function and stroke outcome. Euthyroidism was defined as TSH levels of 0.45-4.49 mIU/L, and subclinical hypothyroidism was defined as TSH levels of 4.5-19.9 mIU/L with normal T4 levels. DATA EXTRACTION AND SYNTHESIS We collected individual participant data on 47 573 adults (3451 subclinical hypothyroidism) from 17 cohorts and followed up from 1972-2014 (489 192 person-years). Age- and sex-adjusted pooled hazard ratios (HRs) for participants with subclinical hypothyroidism compared to euthyroidism were 1.05 (95% confidence interval [CI], 0.91-1.21) for stroke events (combined fatal and nonfatal stroke) and 1.07 (95% CI, 0.80-1.42) for fatal stroke. Stratified by age, the HR for stroke events was 3.32 (95% CI, 1.25-8.80) for individuals aged 18-49 years. There was an increased risk of fatal stroke in the age groups 18-49 and 50-64 years, with HRs of 4.22 (95% CI, 1.08-16.55) and 2.86 (95% CI, 1.31-6.26), respectively (p for trend = 0.04). We found no increased risk for those 65-79 years old (HR, 1.00; 95% CI, 0.86-1.18) or ≥ 80 years old (HR, 1.31; 95% CI, 0.79-2.18). There was a pattern of increased risk of fatal stroke with higher TSH concentrations. CONCLUSIONS Although no overall effect of subclinical hypothyroidism on stroke could be demonstrated, an increased risk was observed in subjects younger than 65 years and in those with higher TSH concentrations.
Abstract:
A problem frequently encountered in Data Envelopment Analysis (DEA) is that the total number of inputs and outputs included tends to be too large relative to the sample size. One way to counter this problem is to combine several inputs (or outputs) into (meaningful) aggregate variables, thereby reducing the dimension of the input (or output) vector. A direct effect of input aggregation is to reduce the number of constraints. This, in turn, alters the optimal value of the objective function. In this paper, we show how a statistical test proposed by Banker (1993) may be applied to test the validity of a specific way of aggregating several inputs. An empirical application using data from Indian manufacturing for the year 2002-03 is included as an example of the proposed test.
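The mechanism the abstract describes can be illustrated with a small sketch. The following is a minimal input-oriented, constant-returns-to-scale (CCR) DEA model solved with `scipy.optimize.linprog` on hypothetical data; it is not Banker's (1993) test itself, but it shows the effect being tested: summing two inputs into one aggregate removes input constraints, so each efficiency score can only stay the same or fall.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: (m inputs x n DMUs), Y: (s outputs x n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1 .. lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: sum_j lam_j x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # Output constraints: -sum_j lam_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# Hypothetical data: 4 firms, 2 inputs (e.g. labour, capital), 1 output.
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 2.0, 4.0, 5.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])

eff = [ccr_efficiency(X, Y, o) for o in range(4)]

# Aggregate the two inputs into their sum: one constraint instead of
# two, so efficiency scores can only stay the same or decrease.
X_agg = X.sum(axis=0, keepdims=True)
eff_agg = [ccr_efficiency(X_agg, Y, o) for o in range(4)]
```

Comparing `eff` with `eff_agg` on real data is the raw material on which an aggregation test such as Banker's would then operate.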
Abstract:
In the United States, “binge” drinking among college students is an emerging public health concern due to its significant physical and psychological effects on young adults. The focus is on identifying interventions that can help decrease high-risk drinking behavior in this group of drinkers. One such intervention is motivational interviewing (MI), a client-centered therapy that aims to resolve client ambivalence by developing discrepancy and engaging the client in change talk. Recently, there has been growing interest in determining the active ingredients that influence the alliance between the therapist and the client. This study is a secondary analysis of data obtained from the Southern Methodist Alcohol Research Trial (SMART) project, a dismantling trial of MI and feedback among heavy-drinking college students. The present project examines the relationship between therapist and client language in MI sessions in a sample of “binge” drinking college students. Of the 126 SMART tapes, 30 tapes (‘MI with feedback’ group = 15, ‘MI only’ group = 15) were randomly selected for this study. MISC 2.1, a mutually exclusive and exhaustive coding system, was used to code the audio/videotaped MI sessions. Therapist and client language were analyzed for communication characteristics. Overall, therapists adopted an MI-consistent style and clients were found to engage in change talk. Counselor acceptance, empathy, spirit, and complex reflections were all significantly related to client change talk (p-values ranged from 0.001 to 0.047). Additionally, therapist ‘advice without permission’ and MI-inconsistent therapist behaviors were strongly correlated with client sustain talk (p-values ranged from 0.006 to 0.048).
Simple linear regression models showed a significant association between MI-consistent (MICO) therapist language (independent variable) and change talk (dependent variable), and between MI-inconsistent (MIIN) therapist language (independent variable) and sustain talk (dependent variable). The study has several limitations, such as a small sample size, self-selection bias, poor inter-rater reliability for the global scales, and the lack of a temporal measure of therapist and client language. Future studies might consider a larger sample size to obtain more statistical power. In addition, the relationship between therapist language, client language, and drinking outcomes needs to be explored.
Abstract:
Introduction. Food frequency questionnaires (FFQ) are used to study the association between dietary intake and disease. An instructional video may offer a low-cost, practical method of dietary assessment training for participants, thereby reducing recall bias in FFQs. There is little evidence in the literature on the effect of instructional videos on FFQ-reported intake. Objective. This analysis compared the reported energy and macronutrient intake of two groups that were randomized either to watch an instructional video before completing an FFQ or to view the same instructional video after completing the same FFQ. Methods. In the parent study, a diverse group of students, faculty and staff from Houston Community College were randomized to two groups, stratified by ethnicity, and completed an FFQ. The "video before" group watched an instructional video about completing the FFQ prior to answering the FFQ. The "video after" group watched the instructional video after completing the FFQ. The two groups were compared on mean daily energy (kcal/day), fat (g/day), protein (g/day), carbohydrate (g/day) and fiber (g/day) intakes using descriptive statistics and one-way ANOVA. Demographic, height, and weight information was collected. Dietary intakes were adjusted for total energy intake before the comparative analysis. BMI and age were ruled out as potential confounders. Results. There were no significant differences between the two groups in mean daily dietary intakes of energy, total fat, protein, carbohydrates and fiber. However, a pattern of higher energy intake and lower fiber intake was reported in the group that viewed the instructional video before completing the FFQ compared to those who viewed the video after. Discussion. Analysis of the difference in reported intakes of energy and macronutrients showed an overall pattern, albeit not statistically significant, of higher intake in the video-before versus the video-after group.
Further research on the application of instructional videos for dietary assessment is needed to address the validity of reported dietary intakes in those who are randomized to watch an instructional video before reporting diet compared to a control group that does not view a video.
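The group comparison described above can be sketched as follows, assuming hypothetical intake data (the study's actual data are not reproduced here). With exactly two groups, one-way ANOVA is equivalent to the pooled two-sample t-test, with F equal to t squared.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical daily energy intakes (kcal/day) for the two arms.
video_before = rng.normal(2100, 350, size=60)
video_after = rng.normal(2000, 350, size=60)

# One-way ANOVA comparing the two group means.
f_stat, p_value = stats.f_oneway(video_before, video_after)

# With two groups this reduces to the pooled two-sample t-test.
t_stat, p_t = stats.ttest_ind(video_before, video_after)
```

The same pattern extends to each nutrient (fat, protein, carbohydrate, fiber), run as a separate one-way ANOVA after energy adjustment.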
Abstract:
Next-generation DNA sequencing platforms can effectively detect the entire spectrum of genomic variation and are emerging as a major tool for systematic exploration of the universe of variants and interactions in the entire genome. However, the data produced by next-generation sequencing technologies suffer from three basic problems: sequencing errors, assembly errors, and missing data. Current statistical methods for genetic analysis are well suited to detecting associations of common variants but are less suitable for rare variants. This poses a great challenge for sequence-based genetic studies of complex diseases. This research dissertation used the genome continuum model as a general principle, and stochastic calculus and functional data analysis as tools, to develop novel and powerful statistical methods for the next generation of association studies of both qualitative and quantitative traits in the context of sequencing data, ultimately shifting the paradigm of association analysis from the current locus-by-locus analysis to collectively analyzing genome regions. In this project, functional principal component (FPC) methods coupled with high-dimensional data reduction techniques were used to develop novel and powerful methods for testing the associations of the entire spectrum of genetic variation within a segment of genome or a gene, regardless of whether the variants are common or rare. Classical quantitative genetics suffers from high type I error rates and low power for rare variants. To overcome these limitations for resequencing data, this project used functional linear models with scalar response to develop statistics for identifying quantitative trait loci (QTLs) for both common and rare variants. To illustrate their applications, the functional linear models were applied to five quantitative traits in the Framingham heart studies.
This project also proposed a novel concept of gene-gene co-association, in which a gene or a genomic region is taken as the unit of association analysis, and used stochastic calculus to develop a unified framework for testing the association of multiple genes or genomic regions for both common and rare alleles. The proposed methods were applied to gene-gene co-association analysis of psoriasis in two independent GWAS datasets, leading to the discovery of networks significantly associated with psoriasis.
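The FPC idea can be illustrated with a minimal sketch: when genotypes in a region are densely sampled at discrete positions, functional principal components reduce to ordinary principal components of the individual-by-position genotype matrix, computable with an SVD. The data below are synthetic, and the dissertation's actual smoothing and eigen-expansion steps are not reproduced; this only shows the dimension-reduction step on which the association tests are built.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 200   # individuals x genomic positions in a region
# Hypothetical genotype indicators (0/1/2 copies of the minor allele).
G = rng.binomial(2, 0.05, size=(n, p)).astype(float)

# Centre each position, then take the SVD; the right singular vectors
# play the role of (discretised) functional principal components, and
# U * S gives the per-individual FPC scores used in downstream tests.
Gc = G - G.mean(axis=0)
U, S, Vt = np.linalg.svd(Gc, full_matrices=False)
scores = U * S                       # n x min(n, p) FPC scores
var_explained = S**2 / (S**2).sum()  # proportion of variance per FPC
```

Association testing would then regress the trait on the first few columns of `scores`, pooling the signal of common and rare variants across the region.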
Abstract:
Introduction. Despite the ban on lead-containing gasoline and paint, childhood lead poisoning remains a public health issue. Furthermore, a Medicaid-eligible child is 8 times more likely to have an elevated blood lead level (EBLL) than a non-Medicaid child, which is the primary reason for the early-detection lead screening mandate at ages 12 and 24 months among the Medicaid population. Based on field observations, there was evidence suggesting a screening compliance issue. Objective. The purpose of this study was to analyze blood lead screening compliance in previously lead-poisoned Medicaid children and test for an association between timely lead screening and timely childhood immunizations. The mean months between follow-up tests were also examined for a significant difference between the non-compliant and compliant lead-screened children. Methods. Access to the surveillance data on all childhood lead poisoning cases in Bexar County was granted by the San Antonio Metropolitan Health District. A database was constructed and analyzed using descriptive statistics, logistic regression methods and non-parametric tests. Lead screening at 12 months of age was analyzed separately from lead screening at 24 months. The small portion of the population who were related to one another were included in one analysis and removed from a second to check whether this affected significance. Gender, ethnicity, age of home, and having a sibling with an EBLL were ruled out as confounders for the association tests, but ethnicity and age of home were adjusted for in the nonparametric tests. Results. There was a strong significant association between lead screening compliance at 12 months and childhood immunization compliance, with or without including related children (p<0.00). However, there was no significant association between the two variables at the age of 24 months.
Furthermore, there was no significant difference in the median of the mean months between follow-up blood tests among the non-compliant and compliant lead-screened population in the 12-month screening group, but there was a significant difference in the 24-month screening group (p<0.01). Discussion. Descriptive statistics showed that 61% and 56% of the previously lead-poisoned Medicaid population did not receive their 12- and 24-month mandated lead screening on time, respectively. This suggests that their elevated blood lead levels could have been detected earlier in childhood. Furthermore, a child who is compliant with their lead screening at 12 months of age is 2.36 times more likely to also receive their childhood immunizations on time compared to a child who was not compliant with the 12-month screening. Even though no statistically significant association was found for the 24-month group, the public health significance of a screening compliance issue is no less important. The Texas Medicaid program needs to enforce lead screening compliance, because it is evident that there has been no monitoring system in place. Further recommendations include an increased focus on parental education and on the importance of taking children for wellness exams on time.
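The "2.36 times more likely" figure above is an odds ratio from the logistic regression. A minimal sketch of how such a figure is computed from a 2x2 table, with hypothetical counts chosen purely for illustration (they are not the study's data):

```python
import math

# Hypothetical 2x2 counts: rows = compliant with the 12-month lead
# screen (yes / no), columns = immunizations on time (yes / no).
a, b = 90, 40   # screened on time: immunized / not immunized
c, d = 60, 63   # not screened on time: immunized / not immunized

# Odds ratio: odds of timely immunization among the screened
# divided by the odds among the unscreened.
odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log-odds scale.
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
```

A logistic regression with a single binary predictor reproduces exactly this odds ratio as the exponentiated coefficient; the regression form is what allows the study's confounder checks (gender, ethnicity, age of home, sibling EBLL) to be added.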
Abstract:
The role of clinical chemistry has traditionally been to evaluate acutely ill or hospitalized patients. Traditional statistical methods have serious drawbacks in that they use univariate techniques. To demonstrate alternative methodology, a multivariate analysis of covariance model was developed and applied to data from the Cooperative Study of Sickle Cell Disease (CSSCD). The purpose of developing the model for the laboratory data from the CSSCD was to evaluate the comparability of results from the different clinics. Several variables were incorporated into the model in order to control for possible differences among the clinics that might confound any real laboratory differences. Differences for LDH, alkaline phosphatase and SGOT were identified which will necessitate adjustments by clinic whenever these data are used. In addition, aberrant clinic values for LDH, creatinine and BUN were also identified. The use of any statistical technique, including multivariate analysis, without thoughtful consideration may lead to spurious conclusions that may not be corrected for some time, if ever. However, the advantages of multivariate analysis far outweigh its potential problems. If its use increases as it should, its applicability to the analysis of laboratory data in prospective patient monitoring, quality control programs, and interpretation of data from cooperative studies could well have a major impact on the health and well-being of a large number of individuals.
Abstract:
Approximately 795,000 new and recurrent strokes occur each year. Because of the resulting functional impairment, stroke survivors are often discharged into the care of a family caregiver, most often their spouse. This dissertation explored the effect that mutuality, a measure of the perceived positive aspects of the caregiving relationship, had on the stress and depression of 159 stroke survivors and their spousal caregivers over the first 12 months post discharge from inpatient rehabilitation. Specifically, cross-lagged regression was utilized to investigate the dyadic, longitudinal relationship between caregiver and stroke survivor mutuality and caregiver and stroke survivor stress over time. Longitudinal mediational analysis was employed to examine the mediating effect of mutuality on the dyads’ perception of family function and on caregiver and stroke survivor depression over time. Caregivers’ mutuality was found to be associated with their own stress over time but not with the stress of the stroke survivor. Caregivers who had higher mutuality scores over the 12 months of the study had lower perceived stress. Additionally, a partner effect of stress was found for the stroke survivor but not the caregiver, indicating that stroke survivors’ stress over time was associated with caregivers’ stress, but caregivers’ stress over time was not significantly associated with the stress of the stroke survivor. This dissertation did not find mutuality to mediate the relationship between caregivers’ and stroke survivors’ perception of family function at baseline and their own or their partners’ depression at 12 months, as hypothesized. However, caregivers who perceived healthier family functioning at baseline and stroke survivors who had higher perceived mutuality at 12 months had lower depression at one year post discharge from inpatient rehabilitation.
Additionally, caregiver mutuality at 6 months, but not at baseline or 12 months, was found to be inversely related to caregiver depression at 12 months. These findings highlight the interpersonal nature of stress in the context of caregiving, especially within spousal relationships. Thus, health professionals should encourage caregivers and stroke survivors to focus on the positive aspects of the caregiving relationship in order to mitigate stress and depression.
Abstract:
Autoimmune diseases are a group of inflammatory conditions in which the body's immune system attacks its own cells. There are over 80 diseases classified as autoimmune disorders, affecting up to 23.5 million Americans. Obesity affects 32.3% of the US adult population and could also be considered an inflammatory condition, as indicated by the presence of chronic low-grade inflammation. C-reactive protein (CRP) is a marker of inflammation and is associated with both adiposity and autoimmune inflammation. This study sought to determine the cross-sectional association between obesity and autoimmune diseases in a large, nationally representative population derived from NHANES 2009–10 data, and the role CRP might play in this relationship. Overall, the results showed that individuals with autoimmune disease were 2.11 times more likely to report being overweight than individuals without autoimmune disease, and that CRP had a mediating effect on the obesity-autoimmune relationship.
Abstract:
Next-generation sequencing (NGS) technology has become a prominent tool in biological and biomedical research. However, NGS data analysis, such as de novo assembly, mapping and variant detection, is far from mature, and the high sequencing error rate is one of the major problems. To minimize the impact of sequencing errors, we developed a highly robust and efficient method, MTM, to correct the errors in NGS reads. We demonstrated the effectiveness of MTM on both single-cell data with highly non-uniform coverage and normal data with uniformly high coverage, showing that MTM's performance does not rely on the coverage of the sequencing reads. MTM was also compared with Hammer and Quake, the best methods for correcting non-uniform and uniform data, respectively. For non-uniform data, MTM outperformed both Hammer and Quake. For uniform data, MTM showed better performance than Quake and comparable results to Hammer. By making better error corrections with MTM, the quality of downstream analysis, such as mapping and SNP detection, was improved. SNP calling is a major application of NGS technologies. However, the existence of sequencing errors complicates this process, especially for the low coverage (
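MTM's algorithm is not detailed in the abstract, but the k-mer spectrum idea underlying comparison tools such as Quake can be sketched as follows: k-mers that occur rarely across all reads are likely to contain a sequencing error, while k-mers seen many times are trusted. The reads below are hypothetical, and the sketch shows only the trusted/untrusted split, not the subsequent correction step.

```python
from collections import Counter

def kmer_spectrum(reads, k):
    """Count every length-k substring (k-mer) across a set of reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def flag_error_kmers(counts, cutoff=2):
    """K-mers seen fewer than `cutoff` times are flagged as
    likely to contain a sequencing error."""
    return {kmer for kmer, c in counts.items() if c < cutoff}

# Hypothetical reads: ten copies of the same fragment plus one read
# carrying a single substitution error (G -> T at position 6).
reads = ["ACGTACGGT"] * 10 + ["ACGTACTGT"]
counts = kmer_spectrum(reads, k=5)
errors = flag_error_kmers(counts)
# Every k-mer overlapping the error is flagged; the error position
# is where the flagged k-mers pile up.
```

The cutoff choice is the crux in practice: too low misses errors, too high discards genuine low-coverage k-mers, which is exactly why non-uniform (single-cell) coverage is hard for fixed-threshold methods.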
Abstract:
In September 1999, the International Monetary Fund (IMF) established the Poverty Reduction and Growth Facility (PRGF) to make the reduction of poverty and the enhancement of economic growth the fundamental objectives of lending operations in its poorest member countries. This paper studies the spending and absorption of aid in PRGF-supported programs, verifies whether the use of aid is programmed to be smoothed over time, and analyzes how considerations of macroeconomic stability influence the programmed use of aid. The paper shows that PRGF-supported programs permit countries to utilize all increases in aid within a few years, indicating smoothed use of aid inflows over time. Our results reveal that spending is higher than absorption in both the long-run and short-run use of aid, a robust finding of the study. Furthermore, the paper demonstrates that long-run spending exceeds the injected increase of aid inflows in the economy. In addition, the paper finds that the presence of a PRGF-supported program does not influence the actual absorption or spending of aid.
Abstract:
The paper focuses on the recent pattern of government consumption expenditure in developing countries and estimates the determinants that have influenced government expenditure. Using a panel data set for 111 developing countries from 1984 to 2004, this study finds evidence that political and institutional variables, as well as governance variables, significantly influence government expenditure. Among other results, the paper finds new evidence for Wagner's law, which states that people's demand for services and willingness to pay are income-elastic; hence the expansion of the public economy is influenced by the greater economic affluence of a nation (Cameron 1978). Corruption is found to be influential in explaining the public expenditure of developing countries. In contrast, the size of the economy and fractionalization are found to have significant negative associations with government expenditure. In addition, the study finds evidence that public expenditure shrinks significantly under military dictatorship compared with other forms of governance.