13 results for COMPETING RISKS
in DigitalCommons@The Texas Medical Center
Abstract:
The relationship between serum cholesterol and cancer incidence was investigated in the population of the Hypertension Detection and Follow-up Program (HDFP). The HDFP was a multi-center trial designed to test the effectiveness of a stepped program of medication in reducing mortality associated with hypertension. Over 10,000 participants, ages 30-69, were followed with clinic and home visits for a minimum of five years. Cancer incidence was ascertained from existing study documents, which included hospitalization records, autopsy reports and death certificates. During the five years of follow-up, 286 new cancer cases were documented. The distribution of sites and total number of cases were similar to those predicted using rates from the Third National Cancer Survey. A non-fasting baseline serum cholesterol level was available for most participants. Age, sex, and race specific five-year cancer incidence rates were computed for each cholesterol quartile. Rates were also computed by smoking status, education status, and percent ideal weight quartiles. In addition, these and other factors were investigated with the use of the multiple logistic model.
For all cancers combined, a significant inverse relationship existed between baseline serum cholesterol levels and cancer incidence. Previously documented associations between smoking, education and cancer were also demonstrated but did not account for the relationship between serum cholesterol and cancer. The relationship was more evident in males than females, but this was felt to reflect the different distribution of specific cancer sites in the two sexes. The inverse relationship existed for all specific sites investigated (except breast), although statistical significance was reached only for prostate carcinoma. Analyses after exclusion of cases diagnosed during the first two years of follow-up still yielded an inverse relationship. Life table analysis indicated that competing risks during the period of follow-up did not account for the existence of an inverse relationship. It is concluded that a weak inverse relationship does exist between serum cholesterol and cancer incidence for many but not all cancer sites. This relationship is not due to confounding by other known cancer risk factors, competing risks or persons entering the study with undiagnosed cancer. Not enough information is available at the present time to determine whether this relationship is causal, and further research is suggested.
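A minimal sketch of the kind of quartile and logistic-model analysis described above, run on simulated data. The data frame, variable names, and the simulated inverse association are assumptions for illustration; this is not the HDFP analysis or its data.

```python
# Illustrative only: five-year cancer incidence by baseline cholesterol quartile,
# followed by a multiple logistic model adjusting for age, sex, and smoking.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 10_000
df = pd.DataFrame({
    "cholesterol": rng.normal(220, 40, n),      # mg/dL, non-fasting baseline
    "age": rng.integers(30, 70, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})
# Simulate a weak inverse cholesterol-cancer relationship (an assumption for demonstration).
logit = -4.0 - 0.004 * (df["cholesterol"] - 220) + 0.03 * (df["age"] - 50) + 0.4 * df["smoker"]
df["cancer"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Crude five-year incidence per 1,000 by cholesterol quartile
df["chol_q"] = pd.qcut(df["cholesterol"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(df.groupby("chol_q", observed=True)["cancer"].mean().mul(1000).round(1))

# Does the cholesterol association persist after adjustment for other factors?
model = smf.logit("cancer ~ cholesterol + age + male + smoker", data=df).fit(disp=0)
print(model.params.round(4))
```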
Abstract:
A life table methodology was developed which estimates the expected remaining Army service time and the expected remaining Army sick time by years of service for the United States Army population. A measure of illness impact was defined as the ratio of expected remaining Army sick time to the expected remaining Army service time. The variances of the resulting estimators were developed on the basis of current data. The theory of partial and complete competing risks was considered for each type of decrement (death, administrative separation, and medical separation) and for the causes of sick time.
The methodology was applied to world-wide U.S. Army data for calendar year 1978. A total of 669,493 enlisted personnel and 97,704 officers were reported on active duty as of 30 September 1978. During calendar year 1978, the Army Medical Department reported 114,647 inpatient discharges and 1,767,146 sick days. Although the methodology is completely general with respect to the definition of sick time, only sick time associated with an inpatient episode was considered in this study.
Since the temporal measure was years of Army service, an age-adjusting process was applied to the life tables for comparative purposes. Analyses were conducted by rank (enlisted and officer), race and sex, and were based on the ratio of expected remaining Army sick time to expected remaining Army service time. Seventeen major diagnostic groups, classified by the Eighth Revision, International Classification of Diseases, Adapted for Use in the United States, were ranked according to their cumulative (across years of service) contribution to expected remaining sick time.
The study results indicated that enlisted personnel tend to have more expected hospital-associated sick time relative to their expected Army service time than officers. Non-white officers generally have more expected sick time relative to their expected Army service time than white officers. This racial differential was not supported within the enlisted population. Females tend to have more expected sick time relative to their expected Army service time than males. This tendency remained after diagnostic groups 580-629 (Genitourinary System) and 630-678 (Pregnancy and Childbirth) were removed. Problems associated with the circulatory system, digestive system and musculoskeletal system were among the three leading causes of cumulative sick time across years of service.
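The central quantities above (expected remaining service time, expected remaining sick time, and their ratio by years of service) can be sketched with a simple multiple-decrement life table. The decrement probabilities and sick-day rates below are invented placeholders, not the 1978 Army data, and the sketch ignores the variance estimation and age adjustment steps.

```python
# A minimal multiple-decrement life-table sketch: expected remaining service time,
# expected remaining sick time, and their ratio, indexed by years of service (YOS).
import numpy as np

max_yos = 30                                  # years of service considered
# Assumed annual decrement probabilities by cause (death, administrative sep., medical sep.)
q_death = np.full(max_yos, 0.001)
q_admin = np.full(max_yos, 0.080)
q_med   = np.full(max_yos, 0.010)
q_total = q_death + q_admin + q_med           # combined probability of leaving in a year
sick_days_per_py = np.full(max_yos, 2.5)      # assumed mean inpatient sick days per person-year

# Fraction still serving at the start of each year of service
l = np.concatenate(([1.0], np.cumprod(1.0 - q_total)))[:max_yos]
person_years = l * (1.0 - q_total / 2.0)      # approximate person-years lived in each year

for yos in (0, 5, 10):
    remaining_service = person_years[yos:].sum() / l[yos]
    remaining_sick = (person_years[yos:] * sick_days_per_py[yos:] / 365.25).sum() / l[yos]
    print(f"YOS {yos:2d}: E[service] = {remaining_service:5.2f} yr, "
          f"E[sick] = {remaining_sick:5.3f} yr, ratio = {remaining_sick / remaining_service:.4f}")
```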
Abstract:
Evaluation of the impact of a disease on life expectancy is an important part of public health. Potential gains in life expectancy (PGLE) that properly take competing risks into account are an effective indicator for measuring the impact of multiple causes of death. This study aimed to measure the PGLEs from reducing or eliminating the major causes of death in the USA from 2001 to 2008. To calculate the PGLEs due to the elimination of specific causes of death, the age-specific mortality rates for heart disease, malignant neoplasms, Alzheimer's disease, kidney diseases and HIV/AIDS and life table construction data were obtained from the National Center for Health Statistics, and multiple decremental life tables were constructed. The PGLEs by elimination of heart disease, malignant neoplasms or HIV/AIDS continued decreasing from 2001 to 2008, but the PGLEs by elimination of Alzheimer's disease or kidney diseases showed increasing trends. The PGLEs (in years) for all races, male, female, white, white male, white female, black, black male and black female at birth by complete elimination of heart disease for 2001–2008 were 0.336–0.299, 0.327–0.301, 0.344–0.295, 0.360–0.315, 0.349–0.317, 0.371–0.316, 0.278–0.251, 0.272–0.255, and 0.282–0.246, respectively. The PGLEs (in years) for the same groups at birth by complete elimination of malignant neoplasms, Alzheimer's disease, kidney disease or HIV/AIDS for 2001–2008 were computed in the same way. Most diseases affect specific populations: HIV/AIDS tends to have a greater impact on people of working age, heart disease and malignant neoplasms have a greater impact on people over 65 years of age, and Alzheimer's disease and kidney diseases have a greater impact on people over 75 years of age. To measure the impact of these diseases on life expectancy in people of working age, partial multiple decremental life tables were constructed and the PGLEs were computed by partial or complete elimination of various causes of death during the working years. The results of the study thus outline a picture of how each single disease could affect life expectancy in age-, race-, or sex-specific populations in the USA. The findings will not only assist in evaluating current public health improvements, but also provide useful information for future research and disease control programs.
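A minimal sketch of the cause-elimination idea behind a PGLE calculation: build a life table from all-cause death rates, rebuild it with one cause's share of deaths removed, and take the difference in life expectancy at birth. The rates and cause fractions below are rough illustrative values, not NCHS data, and the cause-deleted rates use a simple proportional subtraction rather than the full Chiang cause-elimination method.

```python
# Illustrative cause-deleted life table and PGLE at birth (stylized rates, not NCHS data).
import numpy as np

ages = np.arange(0, 101, 1)                        # single-year ages 0-100
m_all = 0.0001 * np.exp(0.085 * ages)              # stylized all-cause mortality (Gompertz-like)
frac_heart = np.clip(0.05 + 0.003 * ages, 0, 0.4)  # assumed share of deaths from heart disease

def life_expectancy(m):
    """Period life expectancy at birth from age-specific death rates m (1-year intervals)."""
    q = m / (1.0 + 0.5 * m)                        # rate-to-probability conversion
    l = np.concatenate(([1.0], np.cumprod(1.0 - q)))
    L = 0.5 * (l[:-1] + l[1:])                     # person-years lived in each interval
    return L.sum()

e0_all = life_expectancy(m_all)
e0_no_heart = life_expectancy(m_all * (1.0 - frac_heart))   # simplified cause-deleted rates
print(f"e0 = {e0_all:.2f}, e0 without heart disease = {e0_no_heart:.2f}, "
      f"PGLE = {e0_no_heart - e0_all:.2f} years")
```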
Abstract:
It is widely accepted that hypoplastic left heart syndrome (HLHS), aortic valve stenosis with or without bicuspid aortic valve (AS/BAV) and coarctation of the aorta (CoA) occur in families more commonly with each other than with any other congenital heart defect (CHD). Genetic counseling for CHDs is currently based on empiric risk estimates derived from data collected on all types of CHDs between 1968 and 1990. Additionally, for the specific group of defects described above, termed left-sided lesions (LSLs), estimates are available for sibling recurrence. Utilizing family history data from 757 probands recruited between 1997 and 2007 from The Children’s Hospital of Philadelphia, this study reassessed the pre/recurrence risks for LSLs specifically. Sibling pre/recurrence risks for HLHS (5.5%, 95% CI: 3.1%-8.9%), CoA (4.0%, 95% CI: 2.1%-6.7%), and AS/BAV (6.0%, 95% CI: 3.3%-9.8%) were higher than currently quoted risks based on sibling data for individual LSLs. Additionally, the prevalence of BAV in 202 apparently unaffected parents of 134 probands was assessed by echocardiography. BAV, which occurs at a frequency of 1% in the general population, was found to occur in approximately 10% of parents of LSL probands. Lastly, among affected first-degree relative pairs (i.e. siblings, parent-offspring), the majority (65%-70%) were both affected with an LSL. Defect-specific concordance rates were highest for AS/BAV. Together, these findings suggest that over the past 20 years, with changing diagnostic capabilities and environmental/maternal conditions (e.g. folic acid fortification, increased maternal diabetes and obesity), recurrence risks may have increased compared to current LSL-specific risk estimates. Based on these increased risk estimates and prior studies, a protocol for screening first-degree relatives of LSL probands should be devised.
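The sibling risk figures above are proportions with 95% confidence intervals. As a small illustration of how such an interval can be computed, the sketch below uses the exact Clopper-Pearson method; the counts are hypothetical and chosen only to yield a point estimate near 5.5%, not the study's actual sibling data.

```python
# Exact (Clopper-Pearson) confidence interval for a binomial proportion.
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Return the exact two-sided (1 - alpha) interval for k successes out of n trials."""
    lo = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lo, hi

k, n = 11, 200                    # hypothetical: 11 affected siblings out of 200
lo, hi = clopper_pearson(k, n)
print(f"risk = {k / n:.1%}, 95% CI: {lo:.1%}-{hi:.1%}")
```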
Abstract:
This study examines the relationship among psychological resources (generalized resistance resources), care demands (demands for care, competing demands, perception of burden) and cognitive stress in a selected population of primary family caregivers. The study utilizes Antonovsky's Salutogenic Model of Health, specifically the concept of generalized resistance resources (GRRs), to analyze the relative effect of these resources on mediating cognitive stress, controlling for other care demands. The study is based on a sample of 784 eligible caregivers who (1) were relatives, (2) had the main responsibility for care, defined as a primary caregiver, and (3) provided a scaled stress score for the amount of overall care given to the care recipient (family member). The sample was drawn from the 1982 National Long-Term Care Survey (NLTCS) of individuals who assisted a given NLTCS sample person with ADL limitations.
The study tests the following hypotheses: (a) there will be a negative relationship between generalized resistance resources (GRRs) and cognitive stress, controlling for care demands (demands for care, competing demands, and perceptions of burden); (b) of the specific GRRs (material, cognitive, social, cultural-environmental), the social domain will represent the most significant factor predicting a decrease in cognitive stress; and (c) the social domain will be more significant for the female than the male primary family caregiver in decreasing cognitive stress.
The study found that GRRs had a statistically significant mediating effect on cognitive stress, but the GRRs were a less significant predictor of stress than perception of burden and demands for care. Thus, although the analysis supported the underlying hypothesis, the specific hypothesis regarding GRRs' greater significance in buffering cognitive stress was not supported. Second, the results did not demonstrate statistically significant differences among the GRR domains; the hypothesis that the social GRR domain was most significant in mediating stress of family caregivers was not supported. Finally, the results confirmed that there are differences in the importance of social support help in mediating stress based on gender. It was found that gender and social support help were related to cognitive stress and that gender had a statistically significant interaction effect with social support help. Implications for clinical practice, public health policy, and research are discussed.
Abstract:
The Food and Drug Administration (FDA) is responsible for risk assessment and risk management in the post-market surveillance of the U.S. medical device industry. One of the FDA's regulatory mechanisms, the Medical Device Reporting (MDR) system, is an adverse event reporting system intended to provide the FDA with advance warning of device problems. It includes voluntary reporting for individuals and mandatory reporting for device manufacturers.
In a study of alleged breast implant safety problems, this research examines the organizational processes by which the FDA gathers data on adverse events and uses adverse event reporting systems to assess and manage risk. The research reviews the literature on problem recognition, risk perception, and organizational learning to understand the influence highly publicized events may have on adverse event reporting. Understanding the influence of an environmental factor, such as publicity, on adverse event reporting can provide insight into the question of whether the FDA's adverse event reporting system operates as an early warning system for medical device problems.
The research focuses on two main questions. The first question addresses the relationship between publicity and the voluntary and mandatory reporting of adverse events. The second question examines whether government agencies make use of these adverse event reports.
Using quantitative and qualitative methods, a longitudinal study was conducted of the number and content of adverse event reports regarding breast implants filed with the FDA's medical device reporting system during 1985–1991. To assess variation in publicity over time, the print media were analyzed to identify articles related to breast implant failures.
The exploratory findings suggest that an increase in media activity is related to an increase in voluntary reporting, especially following periods of intense media coverage of the FDA. However, a similar relationship was not found between media activity and manufacturers' mandatory adverse event reporting. A review of government committee and agency reports on the FDA published during 1976–1996 produced little evidence to suggest that publicity or MDR information contributed to problem recognition, agenda setting, or the formulation of policy recommendations.
The research findings suggest that the reporting of breast implant problems to the FDA may reflect the perceptions and concerns of the reporting groups, serving as a barometer of the volume and content of media attention.
Abstract:
Every fifth unintentional injury treated at a healthcare facility in industrialized nations is associated with sports or physical exercise. Though the benefits of exercise on health status are well documented and, for most individuals, far outweigh the risks, participation in sports and exercise programs does carry a risk of injury, illness, or even death. In an effort to decrease these risks, most institutions in the United States, and in the industrialized world, require a pre-participation physical examination for all athletes competing in organized or scholastic sports or exercise programs. Over the last ten years the popularity of outdoor or wilderness sports has increased enormously. Traditional outdoor sports such as skiing and hiking are more popular than ever, and sports that did not exist 10 to 15 years ago, such as adventure racing or mountain biking, are now multimillion-dollar enterprises. This genre of sport appeals to a broad spectrum of individuals and combines the traditional risks of physical activity and exertion with the remoteness and exposure associated with wilderness environments. Wilderness athletes include people of all ages and of both genders. The main causes of morbidity are musculoskeletal injuries and gastrointestinal illnesses; the main causes of mortality are falls and cardiac events. Placing these causes in a Haddon Matrix identified preventive strategies and led to recommendations specifically for the pre-participation physical examination, which include education about the causes of morbidity and mortality in wilderness athletes, instruction about preventing and treating these injuries and illnesses, and screening of athletes at risk for cardiac events. Through these measures the risk of injuries, illnesses and deaths in wilderness athletes can be decreased throughout the world.
Abstract:
External beam radiation therapy is used to treat nearly half of the more than 200,000 new cases of prostate cancer diagnosed in the United States each year. During a radiation therapy treatment, healthy tissues in the path of the therapeutic beam are exposed to high doses. In addition, the whole body is exposed to a low-dose bath of unwanted scatter radiation from the pelvis and leakage radiation from the treatment unit. As a result, survivors of radiation therapy for prostate cancer face an elevated risk of developing a radiogenic second cancer. Recently, proton therapy has been shown to reduce the dose delivered by the therapeutic beam to normal tissues during treatment compared to intensity modulated x-ray therapy (IMXT, the current standard of care). However, the magnitude of stray radiation doses from proton therapy, and their impact on the incidence of radiogenic second cancers, were not known.
The risk of a radiogenic second cancer following proton therapy for prostate cancer relative to IMXT was determined for 3 patients of large, median, and small anatomical stature. Doses delivered to healthy tissues from the therapeutic beam were obtained from treatment planning system calculations. Stray doses from IMXT were taken from the literature, while stray doses from proton therapy were simulated using a Monte Carlo model of a passive scattering treatment unit and an anthropomorphic phantom. Baseline risk models were taken from the Biological Effects of Ionizing Radiation VII report. A sensitivity analysis was conducted to characterize the sensitivity of the risk calculations to uncertainties in the risk model, the relative biological effectiveness (RBE) of neutrons for carcinogenesis, and inter-patient anatomical variations.
The risk projections revealed that proton therapy carries a lower risk of radiogenic second cancer incidence following prostate irradiation compared to IMXT. The sensitivity analysis revealed that the results of the risk analysis depended only weakly on uncertainties in the risk model and inter-patient variations. Second cancer risks were sensitive to changes in the RBE of neutrons. However, the findings of the study were qualitatively consistent for all patient sizes and risk models considered, and for all neutron RBE values less than 100.
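A schematic of the risk bookkeeping this kind of comparison involves: organ doses from the therapeutic beam plus stray neutron dose weighted by an assumed RBE, multiplied by organ-specific lifetime risk coefficients, with the result examined across a range of RBE values. All organ names and numbers below are placeholders, not values from the treatment plans, the Monte Carlo model, or BEIR VII.

```python
# Illustrative second-cancer risk sum and its sensitivity to the assumed neutron RBE.
organs = {
    # organ: (therapeutic dose [Gy], stray neutron dose [Gy], risk coefficient [% per Sv])
    "bladder": (5.0, 0.010, 0.10),
    "rectum":  (8.0, 0.010, 0.07),
    "colon":   (0.5, 0.008, 0.30),
    "lung":    (0.0, 0.005, 0.35),
}

def total_risk(organs, neutron_rbe):
    """Sum of organ risks: (therapeutic dose + RBE-weighted neutron dose) x coefficient."""
    return sum((d_tx + neutron_rbe * d_n) * coeff
               for d_tx, d_n, coeff in organs.values())

# Sensitivity of the projected risk to the assumed neutron RBE for carcinogenesis
for rbe in (5, 25, 100):
    print(f"neutron RBE = {rbe:3d}: projected excess risk ~ {total_risk(organs, rbe):.2f}%")
```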
Abstract:
Based on asthma prevalence data collected in the 2000 BRFSS survey, approximately 14.7 million U.S. adults had current asthma, accounting for 7.2% of the total U.S. population. In Texas alone, state data extrapolated from the 1999-2003 Texas BRFSS suggested that approximately 1 million Texas adults reported current asthma and approximately 11% of the adult population had been diagnosed with the illness during their lifetime. From a public health perspective, the disease is manageable. Comprehensive state-specific asthma surveillance data are necessary to identify disparities in asthma prevalence and asthma-control characteristics among subpopulations and to develop targeted public health interventions. The purpose of this study was to determine the relative importance of various risk factors for asthma and to examine the impact of asthma on health-related quality of life among adult residents of Texas.
The study employed a cross-sectional design of respondents in Texas. The study extracted all the variables related to asthma, along with their associated demographic, socioeconomic, and quality of life variables, from the 2007 BRFSS data for 17,248 adult residents of Texas aged 18 and older. Chi-square tests and logistic regression in SPSS were used in the data analyses on weighted data, adjusting for the complex sample design of the BRFSS data. All chi-square analyses were carried out using SPSS's CSTABULATE command, and logistic regression models were fitted using SPSS's CSLOGISTIC command.
Risk factors significantly associated with reporting current asthma included BMI, race/ethnicity, gender, and income. Holding all other variables constant, obese adults were almost twice as likely to report current asthma as adults of normal weight (odds ratio [OR], 1.78; 95% confidence interval [CI], 1.25 to 2.53). Other non-Hispanic adults were significantly more likely to report current asthma than non-Hispanic Whites (OR, 2.43; 95% CI, 1.38 to 4.25), while Hispanics were significantly less likely to report current asthma than non-Hispanic Whites (OR, 0.38; 95% CI, 0.25 to 0.60), after controlling for all other variables. After adjusting for all other variables, adult females were almost twice as likely to report current asthma as males (OR, 1.97; 95% CI, 1.49 to 2.60). Adults with a household income of less than $15,000 were almost twice as likely to report current asthma as those with an annual household income of $50,000 or more (OR, 1.98; 95% CI, 1.33 to 2.94). With regard to the association between asthma and health-related quality of life, after adjusting for age, race/ethnicity, gender, tobacco use, body mass index (BMI), exercise, education, and income, adults with current asthma were more likely than those without asthma to report more than 15 days of poor physical health (OR, 1.84; 95% CI, 1.29 to 2.60).
Overall, the findings of this study provide insight and valuable information into the populations in Texas most adversely affected by asthma and the health-related consequences of the disease. Further research could build on these findings by replicating this study as closely as possible in other settings and by examining the relationships with hospitalization rates, asthma severity, and mortality.
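The study fitted its models with SPSS's complex-samples procedures (CSLOGISTIC). As a rough Python analogue of a weight-adjusted logistic model, the sketch below uses statsmodels GLM with variance weights on simulated data; it reproduces weighted point estimates but not design-based (stratum/PSU) standard errors, and all variable names and data are assumptions.

```python
# Illustrative weighted logistic regression on simulated data (not BRFSS data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5_000
df = pd.DataFrame({
    "current_asthma": rng.binomial(1, 0.08, n),
    "obese": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "low_income": rng.integers(0, 2, n),
    "survey_weight": rng.uniform(0.2, 5.0, n),   # stand-in for the BRFSS final weight
})

model = smf.glm("current_asthma ~ obese + female + low_income",
                data=df, family=sm.families.Binomial(),
                var_weights=df["survey_weight"]).fit()
print(np.exp(model.params).round(2))             # weighted odds ratios
```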
Abstract:
The occurrence of waste pharmaceuticals has been identified and well documented in water sources throughout North America and Europe. Many studies have been conducted which identify the occurrence of various pharmaceutical compounds in these waters. This project is an extensive review of the documented evidence of this occurrence published in the scientific literature. The review was performed to determine whether this occurrence has a significant impact on the environment and public health. The project found that pharmaceuticals such as sex hormone drugs, antibiotic drugs and antineoplastic/cytostatic agents, as well as their metabolites, have been found to occur in water sources throughout the United States at levels high enough to have noticeable impacts on human health and the environment. It was determined that the primary sources of these pharmaceuticals were wastewater effluent and solid wastes from sewage treatment plants, pharmaceutical manufacturing plants, healthcare and biomedical research facilities, as well as runoff from veterinary medicine applications (including aquaculture).
In addition, current public policies of US governmental agencies such as the Environmental Protection Agency (EPA), Food and Drug Administration (FDA), and Drug Enforcement Administration (DEA) were evaluated to determine whether they adequately control this issue. Specific recommendations for developing these EPA, FDA, and DEA policies have been made to mitigate, prevent, or eliminate this issue.
Other possible interventions, such as implementing engineering controls, were also evaluated as means to mitigate, prevent and eliminate this issue. These engineering controls include improving the wastewater treatment processes utilized by conventional sewage treatment and pharmaceutical manufacturing plants. In addition, administrative controls such as the use of “green chemistry” in drug synthesis and design were explored and evaluated as possible alternatives to mitigate, prevent, or eliminate this issue. Specific recommendations for incorporating these engineering and administrative controls into the applicable EPA, FDA, and DEA policies have also been made.
Abstract:
Additive and multiplicative models of relative risk were used to measure the effect of cancer misclassification and DS86 random errors on lifetime risk projections in the Life Span Study (LSS) of Hiroshima and Nagasaki atomic bomb survivors. The true number of cancer deaths in each stratum of the cancer mortality cross-classification was estimated using sufficient statistics from the EM algorithm. Average survivor doses in the strata were corrected for DS86 random error (σ = 0.45) by use of reduction factors. Poisson regression was used to model the corrected and uncorrected mortality rates with covariates for age at time of bombing, age at time of death, and gender. Excess risks were in good agreement with risks in RERF Report 11 (Part 2) and the BEIR-V report. Bias due to DS86 random error typically ranged from -15% to -30% for both sexes and all sites and models. The total bias, including diagnostic misclassification, of excess risk of nonleukemia for exposure to 1 Sv from age 18 to 65 under the non-constant relative projection model was -37.1% for males and -23.3% for females. Total excess risks of leukemia under the relative projection model were biased -27.1% for males and -43.4% for females. Thus, nonleukemia risks for 1 Sv from ages 18 to 85 (DRREF = 2) increased from 1.91%/Sv to 2.68%/Sv among males and from 3.23%/Sv to 4.02%/Sv among females. Leukemia excess risks increased from 0.87%/Sv to 1.10%/Sv among males and from 0.73%/Sv to 1.04%/Sv among females. Bias was dependent on the gender, site, correction method, exposure profile and projection model considered. Future studies that use LSS data to project risks for U.S. nuclear workers may be downwardly biased if lifetime risk projections are not adjusted for random and systematic errors. (Supported by U.S. NRC Grant NRC-04-091-02.)
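The Poisson regression step described above can be sketched with grouped data and a person-years offset. The tiny grouped table below is fabricated for illustration; it is not LSS data, and no DS86 error correction or misclassification adjustment is applied here.

```python
# Illustrative grouped Poisson regression: deaths with a log person-years offset
# and covariates for dose, age at time of bombing, and sex.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

cells = pd.DataFrame({
    "dose_sv":    [0.0, 0.0, 0.5, 0.5, 1.0, 1.0, 2.0, 2.0],
    "age_atb":    [20, 40, 20, 40, 20, 40, 20, 40],      # age at time of bombing
    "male":       [1, 0, 1, 0, 1, 0, 1, 0],
    "deaths":     [30, 55, 35, 60, 42, 70, 55, 88],
    "person_yrs": [9e4, 8e4, 9e4, 8e4, 9e4, 8e4, 9e4, 8e4],
})

model = smf.glm("deaths ~ dose_sv + age_atb + male", data=cells,
                family=sm.families.Poisson(),
                offset=np.log(cells["person_yrs"])).fit()
print(model.params.round(4))   # log-rate coefficients; exp(dose_sv) ~ rate ratio per Sv
```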
Abstract:
Documented risks of physical activity include reduced bone mineral density at high activity volume and sudden cardiac death among adults and adolescents. Further illumination of these risks is needed to inform future public health guidelines. The present research seeks to 1) quantify the association between physical activity and bone mineral density (BMD) across a broad range of activity volume, 2) assess the utility of an existing pre-screening questionnaire among US adults, and 3) determine whether pre-screening risk stratification by questionnaire predicts referral to a physician among Texas adolescents.
Among 9,468 adults 20 years of age or older in the National Health and Nutrition Examination Survey (NHANES) 2007-2010, linear regression analyses revealed generally higher BMD at the lumbar spine and proximal femur with greater reported activity volume. Only lumbar BMD in women was not associated with activity volume. Among men, BMD was similar at activity beyond four times the minimum volume recommended in the Physical Activity Guidelines. These results suggest that the range of activity reported by US adults is not associated with low BMD at either site.
The American Heart Association / American College of Sports Medicine Preparticipation Questionnaire (AAPQ) was applied to 6,661 adults 40 years of age or older from NHANES 2001-2004 by using NHANES responses to complete AAPQ items. Following AAPQ referral criteria, 95.5% of women and 93.5% of men would be referred to a physician before exercise initiation, suggesting little utility for the AAPQ among adults aged 40 years or older. Unnecessary referral before exercise initiation may present a barrier to exercise adoption and may strain an already stressed healthcare infrastructure.
Among 3,181 athletes in the Texas Adolescent Athlete Heart Screening Registry, 55.2% of boys and 62.2% of girls were classified as high-risk based on questionnaire answers. Using sex-stratified contingency table analyses, risk categories were not significantly associated with referral to a physician based on electrocardiogram or echocardiogram, nor were they associated with confirmed diagnoses on follow-up. Additional research is needed to identify which symptoms are most closely related to sudden cardiac death and to determine the best methods for rapid and reliable assessment.
In conclusion, this research suggests that the volume of activity reported by US adults is not associated with low BMD at two clinically relevant sites, casts doubt on the utility of two existing cardiac screening tools, and raises concern about barriers to activity erected through ineffective screening. These findings augment existing research in this area and may inform revisions to the Physical Activity Guidelines regarding risk mitigation.
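A minimal sketch of the first analysis (linear regression of BMD on reported activity volume with basic covariates), run on simulated data. The variable names and values are assumptions, not NHANES 2007-2010 variables.

```python
# Illustrative linear regression of bone mineral density on activity volume.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2_000
df = pd.DataFrame({
    "activity_met_min_wk": rng.gamma(2.0, 400.0, n),   # MET-minutes of activity per week
    "age": rng.integers(20, 80, n),
    "male": rng.integers(0, 2, n),
    "bmi": rng.normal(28, 5, n),
})
df["femur_bmd"] = (0.95 + 0.00002 * df["activity_met_min_wk"]
                   - 0.002 * (df["age"] - 50) + 0.08 * df["male"]
                   + rng.normal(0, 0.1, n))            # g/cm^2, simulated

fit = smf.ols("femur_bmd ~ activity_met_min_wk + age + male + bmi", data=df).fit()
print(fit.params.round(5))
```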
Abstract:
Conservative procedures in low-dose risk assessment are used to set safety standards for known or suspected carcinogens. However, the assumptions upon which the methods are based and the effects of these methods are not well understood.
To minimize the number of false negatives and to reduce the cost of bioassays, animals are given very high doses of potential carcinogens. Results must then be extrapolated to much smaller doses to set safety standards for risks such as one per million. There are a number of competing methods that add a conservative safety factor into these calculations.
A method of quantifying the conservatism of these methods was described and tested on eight procedures used in setting low-dose safety standards. The results using these procedures were compared by computer simulation and by the use of data from a large-scale animal study.
The method consisted of determining a "true safe dose" (tsd) according to an assumed underlying model. If one assumes that Y = the probability of cancer = P(d), a known mathematical function of the dose, then by setting Y to some predetermined acceptable risk, one can solve for d, the model's "true safe dose".
Simulations were generated, assuming a binomial distribution, for an artificial bioassay. The eight procedures were then used to determine a "virtual safe dose" (vsd) that estimates the tsd, assuming a risk of one per million. A ratio R = (tsd - vsd)/vsd was calculated for each "experiment" (simulation). The mean R over 500 simulations and the probability that R < 0 were used to measure the over- and under-conservatism of each procedure.
The eight procedures included Weil's method, Hoel's method, the Mantel-Bryan method, the improved Mantel-Bryan method, Gross's method, fitting a one-hit model, Crump's procedure, and applying Rai and Van Ryzin's method to a Weibull model.
None of the procedures performed uniformly well for all types of dose-response curves. When the data were linear, the one-hit model, Hoel's method, or the Gross-Mantel method worked reasonably well. However, when the data were non-linear, these same methods were overly conservative. Crump's procedure and the Weibull model performed better in these situations.
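A schematic of the simulation design described above: assume a true dose-response model, derive the tsd for a one-per-million risk, estimate a vsd from simulated high-dose bioassay data, and summarize R = (tsd - vsd)/vsd over replicates. The extrapolation step below is a generic linear-from-upper-bound stand-in, not any of the eight procedures actually studied, and all parameter values are invented.

```python
# Illustrative simulation of the conservatism measure R = (tsd - vsd) / vsd.
import numpy as np

rng = np.random.default_rng(3)
beta_true = 0.02                              # assumed one-hit model: P(d) = 1 - exp(-beta * d)
target_risk = 1e-6
tsd = -np.log(1.0 - target_risk) / beta_true  # "true safe dose" under the assumed model

doses = np.array([10.0, 50.0, 100.0])         # high bioassay doses
n_per_group = 50
R = []
for _ in range(500):
    tumors = rng.binomial(n_per_group, 1.0 - np.exp(-beta_true * doses))
    # Stand-in conservative step: upper ~95% bound on response at the lowest dose,
    # extrapolated linearly through the origin to find the dose giving the target risk.
    p_hat = tumors[0] / n_per_group
    p_upper = p_hat + 1.645 * np.sqrt(p_hat * (1 - p_hat) / n_per_group + 1e-9)
    vsd = target_risk * doses[0] / p_upper
    R.append((tsd - vsd) / vsd)

R = np.array(R)
print(f"mean R = {R.mean():.2f}, P(R < 0) = {(R < 0).mean():.3f}")
```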