962 results for Biology, Biostatistics|Health Sciences, Nutrition|Health Sciences, Epidemiology|Health Sciences, Oncology
Abstract:
Analysis of recurrent events has been widely discussed in the medical, health services, insurance, and engineering fields in recent years. This research proposes using a nonhomogeneous Yule process with the proportional intensity assumption to model the hazard function of recurrent events data and the associated risk factors. The method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One advantage of using a nonhomogeneous Yule process for recurrent events is that it assumes the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation provides estimates of the model parameters, and a generalized scoring iterative procedure is applied in the numerical computation. Comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. An example concerning recurrent myocardial infarction events, compared between two distinct populations (Mexican Americans and non-Hispanic Whites) in the Corpus Christi Heart Project, is examined.
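The proportional-intensity form λ_x(t) = λ_0(t) · exp(x′β) can be sketched numerically. Everything below (the baseline function, covariate values, and coefficients β) is hypothetical and not from the study; the sketch only illustrates the key structural property, that the ratio of intensities between two covariate profiles is constant in t.

```python
import math

def intensity(t, x, beta, baseline):
    """Proportional-intensity form: lambda_x(t) = lambda_0(t) * exp(x'beta)."""
    linpred = sum(xi * bi for xi, bi in zip(x, beta))
    return baseline(t) * math.exp(linpred)

# Hypothetical time-varying baseline lambda_0(t) = 0.5 * t (not fitted to data)
baseline = lambda t: 0.5 * t
beta = [0.7, -0.2]  # hypothetical coefficients

for t in (1.0, 2.0, 4.0):
    ratio = intensity(t, [1, 0], beta, baseline) / intensity(t, [0, 0], beta, baseline)
    print(t, round(ratio, 4))  # the ratio equals exp(0.7) at every t
```

The constant ratio exp(β₁) across all t is exactly what "proportional intensity" buys: covariate effects multiply the baseline without changing its shape.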
Abstract:
Background. Diabetes places a significant burden on the health care system. Reduction in blood glucose levels (HbA1c) reduces the risk of complications; however, little is known about the impact of disease management programs on medical costs for patients with diabetes. In 2001, economic costs associated with diabetes totaled $100 billion, and indirect costs totaled $54 billion. Objective. To compare outcomes of nurse case management using treatment algorithms with conventional primary care for glycemic control and cardiovascular risk factors in type 2 diabetic patients in a low-income Mexican American community-based setting, and to compare the cost effectiveness of the two approaches. Patient compliance was also assessed. Research design and methods. An observational group comparison evaluating a treatment intervention for type 2 diabetes management was implemented at three outpatient health facilities in San Antonio, Texas. All eligible type 2 diabetic patients attending the clinics during 1994–1996 became part of the study. Data were obtained from the study database, medical records, hospital accounting, and pharmacy cost lists, and entered into a computerized database. Three groups were compared: a Community Clinic Nurse Case Manager (CC-TA) following treatment algorithms, a University Clinic Nurse Case Manager (UC-TA) following treatment algorithms, and Primary Care Physicians (PCP) following conventional care practices at a Family Practice Clinic. The algorithms provided a disease management model specifically for hyperglycemia, dyslipidemia, hypertension, and microalbuminuria that progressively moved the patient toward ideal goals through adjustments in medication, self-monitoring of blood glucose, meal planning, and reinforcement of diet and exercise. Cost effectiveness at the hemoglobin A1c final endpoints was compared. Results. There were 358 patients analyzed: 106 in the CC-TA, 170 in the UC-TA, and 82 in the PCP groups.
Change in hemoglobin A1c (HbA1c) was the primary outcome measured. HbA1c results at baseline, 6, and 12 months were: CC-TA (10.4%, 7.1%, 7.3%), UC-TA (10.5%, 7.1%, 7.2%), and PCP (10.0%, 8.5%, 8.7%). Mean patient compliance was 81%. Levels of cost effectiveness were significantly different between clinics. Conclusion. Nurse case management with treatment algorithms significantly improved glycemic control in patients with type 2 diabetes and was more cost effective.
Abstract:
Anticancer drugs typically are administered in the clinic in the form of mixtures, sometimes called combinations. Only in rare cases, however, are mixtures approved as drugs. Rather, research on mixtures tends to occur after single drugs have been approved. The goal of this research project was to develop modeling approaches that would encourage rational preclinical mixture design. To this end, a series of models was developed. First, several QSAR classification models were constructed to predict the cytotoxicity, oral clearance, and acute systemic toxicity of drugs. The QSAR models were applied to a set of over 115,000 natural compounds in order to identify promising ones for testing in mixtures. Second, an improved method was developed to assess synergistic, antagonistic, and additive effects between drugs in a mixture. This method, dubbed the MixLow method, is similar to the Median-Effect method, the de facto standard for assessing drug interactions. The primary difference between the two is that the MixLow method uses a nonlinear mixed-effects model to estimate the parameters of concentration-effect curves, rather than an ordinary least squares procedure. Parameter estimators produced by the MixLow method were more precise than those produced by the Median-Effect method, and coverage of Loewe index confidence intervals was superior. Third, a model was developed to predict drug interactions based on scores obtained from virtual docking experiments. This represents a novel approach to modeling drug mixtures and was more useful for the data modeled here than competing approaches. The model was applied to cytotoxicity data for 45 mixtures, each composed of up to 10 selected drugs. One drug, doxorubicin, was a standard chemotherapy agent; the others were well-known natural compounds including curcumin, EGCG, quercetin, and rhein.
Predictions of synergism/antagonism were made for all possible fixed-ratio mixtures, cytotoxicities of the 10 best-scoring mixtures were tested, and drug interactions were assessed. Predicted and observed responses were highly correlated (r² = 0.83). Results suggested that some mixtures allowed up to an 11-fold reduction of doxorubicin concentrations without sacrificing efficacy. Taken together, the models developed in this project present a general approach to the rational design of mixtures during preclinical drug development.
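Both the Median-Effect method and the MixLow method ultimately score a mixture with a Loewe additivity index; they differ in how the concentration-effect parameters are estimated (ordinary least squares versus a nonlinear mixed-effects model). As a sketch of the shared index arithmetic only, with hypothetical median-effect parameters (Dm, m) rather than fitted ones:

```python
def dose_for_effect(fa, dm, m):
    """Median-effect equation fa/(1-fa) = (D/Dm)^m, inverted to give the
    dose D of a single agent producing effect fraction fa."""
    return dm * (fa / (1.0 - fa)) ** (1.0 / m)

def loewe_index(doses, params, fa):
    """Loewe additivity index at effect level fa: sum over drugs of (dose in
    the mixture) / (dose of that drug alone giving the same effect).
    Values < 1 suggest synergy, > 1 antagonism, = 1 additivity."""
    return sum(d / dose_for_effect(fa, dm, m) for d, (dm, m) in zip(doses, params))

# Hypothetical two-drug mixture: each drug at half its single-agent median dose
ci = loewe_index([0.5, 1.0], [(1.0, 1.0), (2.0, 1.5)], fa=0.5)
print(ci)  # 1.0 -> consistent with additivity at the median effect
```

At fa = 0.5 the median-effect equation returns Dm itself, so dosing each drug at half its Dm gives an index of exactly 1, the additivity boundary.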
Abstract:
Drinking water-related exposures within populations living in the United States-Mexico border region, particularly among Hispanics, remain largely uncharacterized. Specifically, the perceptions that may affect water source selection are an issue that has not been fully addressed. This study evaluates drinking water quality perceptions in a mostly Hispanic community living along the United States-Mexico border, a community also facing water scarcity issues. Using a survey administered during two seasons (winter and summer), data were collected from a total of 608 participants, of whom 303 were living in the United States and 305 in Mexico. A convenience sampling technique was used to select households, and those interviewed were over 18 years of age. Statistically significant differences were observed by country of residence (p=0.002): those living in Mexico reported a higher use of bottled water than those living in the United States. Perception factors, especially taste, were cited as the main reasons for not selecting unfiltered tap water as a primary drinking water source. Understanding what influences drinking water source preference can aid in the development of risk communication strategies regarding water quality.
Abstract:
A Monte Carlo simulation was conducted to investigate parameter estimation and hypothesis testing in some well-known adaptive randomization procedures. The four urn models studied are the Randomized Play-the-Winner (RPW), Randomized Pólya Urn (RPU), Birth and Death Urn with Immigration (BDUI), and Drop-the-Loser (DL) urn. Two sequential estimation methods, sequential maximum likelihood estimation (SMLE) and the doubly adaptive biased coin design (DABC), are simulated at three optimal allocation targets that minimize the expected number of failures under the assumption of constant variance of the simple difference (RSIHR), relative risk (ORR), and odds ratio (OOR), respectively. The log likelihood ratio test and three Wald-type tests (simple difference, log of relative risk, log of odds ratio) are compared across the adaptive procedures. Simulation results indicate that although RPW is slightly better at assigning more patients to the superior treatment, the DL method is considerably less variable and its test statistics have better normality. Compared with SMLE, DABC has a slightly higher overall response rate with lower variance, but larger bias and variance in parameter estimation. Additionally, the test statistics in SMLE have better normality and a lower type I error rate, and the power of hypothesis testing is more comparable with equal randomization. Usually, RSIHR has the highest power among the three optimal allocation ratios. However, the ORR allocation has better power and a lower type I error rate when the log of relative risk is the test statistic, and the number of expected failures in ORR is smaller than in RSIHR. It is also shown that the simple difference of response rates has the worst normality among all four test statistics; the power of the hypothesis test is always inflated when the simple difference is used. On the other hand, the normality of the log likelihood ratio test statistic is robust against changes in the adaptive randomization procedure.
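As a sketch of how an urn scheme adapts allocation toward the better arm, here is a minimal RPW(1,1) simulation; the success probabilities and patient count are hypothetical, and the other designs (RPU, BDUI, DL) differ only in their ball-replacement rules.

```python
import random

def rpw_trial(p_success, n_patients, seed=0):
    """Randomized Play-the-Winner RPW(1,1): start with one ball per arm.
    After each response, a success adds a ball for the same arm and a
    failure adds a ball for the other arm, skewing future draws."""
    rng = random.Random(seed)
    urn = [1, 1]        # balls for arm 0 and arm 1
    assigned = [0, 0]   # patients allocated to each arm
    for _ in range(n_patients):
        arm = 0 if rng.random() < urn[0] / sum(urn) else 1
        assigned[arm] += 1
        if rng.random() < p_success[arm]:
            urn[arm] += 1         # success: reinforce the winning arm
        else:
            urn[1 - arm] += 1     # failure: favor the other arm
    return assigned

# With a clearly superior arm 1 (0.8 vs 0.3 success), allocation skews toward it
counts = rpw_trial([0.3, 0.8], 500)
print(counts)
```

In the limit, RPW(1,1) allocates in proportion to the opposite arm's failure probability, here q0/(q0+q1) = 0.7/0.9 ≈ 78% of patients to arm 1, illustrating the "assigns more patients to the superior treatment" property the simulations compare.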
Abstract:
Background. Research into methods for recovery from exercise-induced fatigue is a popular topic in sports medicine, kinesiology, and physical therapy. However, both the quantity and quality of existing studies are lacking, and no clear recovery solution has emerged. An analysis of the statistical methods in the existing literature on performance recovery can enhance the quality of research and provide guidance for future studies. Methods. A literature review was performed using the SCOPUS, SPORTDiscus, MEDLINE, CINAHL, Cochrane Library, and Science Citation Index Expanded databases to extract studies related to human performance recovery from exercise. Original studies, and their statistical analyses, of recovery methods including active recovery, cryotherapy/contrast therapy, massage therapy, diet/ergogenics, and rehydration were examined. Results. The review produced a Research Design and Statistical Method Analysis Summary. Conclusion. Research design and statistical methods can be improved by using the guidelines from the Research Design and Statistical Method Analysis Summary. This summary table lists potential issues and suggested solutions, such as sample size calculation, sport-specific and research design considerations, selection of populations and measurement markers, statistical methods for different analytical requirements, checks of equality of variance and normality of data, post hoc analyses, and effect size calculation.
Abstract:
A cohort study was conducted in the Texas and Louisiana Gulf Coast area on individual workers who had been exposed to asbestos for 15 years or more. Most of these workers were employed in petrochemical industries. Of the 15,742 subjects initially selected for the cohort study, 3,258 had positive chest X-ray findings believed to be related to prolonged asbestos exposure. These subjects were further investigated. Their work-up included detailed medical and occupational histories, laboratory tests, and spirometry. One thousand eight hundred and three cases with positive chest X-ray findings whose data files were considered complete at the end of May 1986 were analyzed, and their findings are included in this report. The prevalences of lung cancer and of cancers at the following sites were significantly increased when compared to data from the Connecticut Tumor Registry: skin, stomach, oropharynx, pancreas, and kidney. The prevalence of other chronic conditions such as hypertension, emphysema, heart disease, and peptic ulcer was also significantly high when compared to data for the U.S. general population furnished by the National Center for Health Statistics (NCHS). In most instances, the occurrence of cancer and of the chronic ailments previously mentioned appeared to follow 15-25 years of exposure to asbestos.
Abstract:
Standardization is a common method for adjusting for confounding factors when comparing two or more exposure categories to assess excess risk. An arbitrary choice of standard population in standardization introduces selection bias due to the healthy worker effect. Small samples in specific groups also pose problems in estimating relative risk, and assessing statistical significance is problematic. As an alternative, statistical models have been proposed to overcome such limitations and find adjusted rates. In this dissertation, a multiplicative model is considered to address the issues related to standardized indices, namely the Standardized Mortality Ratio (SMR) and the Comparative Mortality Factor (CMF). The model provides an alternative to conventional standardization techniques. Maximum likelihood estimates of the model's parameters are used to construct an index, similar to the SMR, for estimating the relative risk of the exposure groups under comparison. The parametric bootstrap resampling method is used to evaluate the goodness of fit of the model, the behavior of the estimated parameters, and the variability in relative risk on generated samples. The model thus provides an alternative to both direct and indirect standardization methods.
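For reference, the conventional SMR that the multiplicative model is meant to complement is simple arithmetic: observed deaths divided by the deaths expected if the standard population's stratum-specific rates applied to the study group. The counts and rates below are hypothetical.

```python
def smr(observed, person_years, standard_rates):
    """Standardized Mortality Ratio = observed deaths / expected deaths,
    where expected deaths apply the standard population's stratum-specific
    rates to the study group's person-years in each stratum (indirect
    standardization)."""
    expected = sum(py * r for py, r in zip(person_years, standard_rates))
    return observed / expected

# Hypothetical study group: two age strata, 10 observed deaths
print(smr(10, [1000, 2000], [0.001, 0.002]))  # expected = 1 + 4 = 5 -> SMR 2.0
```

The arbitrariness the abstract criticizes is visible here: swapping in a different standard population changes `standard_rates`, and hence the expected count and the SMR, even though the study group's data are unchanged.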
Abstract:
The purpose of this study was to elucidate the relationship between mitral valve prolapse and stroke. A population-based historical cohort investigation was conducted among residents of Olmsted County, Minnesota, who had an initial echocardiographic diagnosis of mitral valve prolapse from 1975 through 1989. This cohort (N = 1085) was followed for stroke outcomes using the resources of an operational medical record linkage system. There was an overall two-fold increase in the incidence of stroke among individuals with mitral valve prolapse relative to a standard population (standardized morbidity ratio = 2.12, 95% confidence limits = 1.33-3.21). When the data were partitioned by duration of follow-up from the diagnosis of mitral valve prolapse, or by the calendar years at echocardiographic diagnosis, the association between mitral valve prolapse and stroke was not modified. Mitral valve prolapse subjects 85 years and older were at the highest increased risk of developing stroke relative to the general population (standardized morbidity ratio = 5.47, 95% confidence limits = 2.20-11.24). Coronary heart disease, atrial fibrillation, diabetes mellitus, and hypertension were unlikely to have confounded the association between mitral valve prolapse and stroke. The cumulative risk of first stroke among individuals initially diagnosed with mitral valve prolapse at age 15 to 64 years, given survival to 15.2 years of follow-up, was 4.0%. The corresponding cumulative risk for those diagnosed at age 65 to 74 years, given survival to 11.2 years of follow-up, was 13.2%, and for those diagnosed at age 75 years and older, given survival to 6.7 years of follow-up, 30.6%. Among individuals with mitral valve prolapse, age, diabetes, and atrial fibrillation were associated with an increased risk of stroke.
Atrial fibrillation was associated with a four-fold rate of stroke, and diabetes with a seven-fold rate. Findings from this research support the hypothesis that mitral valve prolapse is linked with stroke as a sequela.
Abstract:
In Conroe, Texas, 492 students ages 5 to 15 participated in a screening examination for a cardiovascular risk factor study. Of these, 141 elementary and junior high school students participated in the present sub-study to investigate the effect of the number of recent life events on blood pressure and on body mass index. Using the elementary and junior high school Coddington scales, life events occurring in the past 12 months were measured for students ages 9 to 14 years; no significant differences in life events were observed by age or sex. The number of life events was not related to blood pressure but was positively correlated with body mass index in children and adolescents.
Abstract:
Radiotherapy has been a method of choice in cancer treatment for a number of years. Mathematical modeling is an important tool in studying the survival behavior of any cell as well as its radiosensitivity. One particular cell under investigation is the normal T-cell, whose radiosensitivity may be indicative of the patient's tolerance to radiation doses. The model derived is a compound branching process with a random initial population of T-cells that is assumed to follow a compound distribution. T-cells in any generation are assumed to double or die at random lengths of time. The population is assumed to undergo a random number of generations within a period of time. The model is then used to obtain an estimate of the survival probability of T-cells for the data under investigation. This estimate is derived iteratively by applying the likelihood principle. The validity of the model is further assessed by simulating a number of subjects under it. This study shows that there is a great deal of variation in T-cell survival from one individual to another. These variations can be observed under normal conditions as well as under radiotherapy. The findings are in agreement with a recent study and show that genetic diversity plays a role in determining the survival of T-cells.
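The abstract's model is a compound branching process with a random initial population and a random number of generations. A deliberately simplified sketch (fixed initial count and generation count, hypothetical doubling probability) shows how a survival probability of this kind can be estimated by Monte Carlo simulation:

```python
import random

def branching_generations(n0, p_double, n_gens, rng):
    """One realization of a simple branching process: each cell in a
    generation independently doubles with probability p_double or dies."""
    n = n0
    for _ in range(n_gens):
        n = sum(2 for _ in range(n) if rng.random() < p_double)
    return n

def survival_probability(n0, p_double, n_gens, reps=2000, seed=1):
    """Monte Carlo estimate of the probability the cell line is not
    extinct after n_gens generations."""
    rng = random.Random(seed)
    alive = sum(1 for _ in range(reps)
                if branching_generations(n0, p_double, n_gens, rng) > 0)
    return alive / reps

# Hypothetical parameters: 5 initial cells, 60% doubling chance, 4 generations
print(survival_probability(5, 0.6, 4))
```

The full model replaces the fixed `n0` and `n_gens` with random quantities and estimates `p_double` from data via the likelihood principle, but the extinction-versus-survival mechanics are the same.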
Abstract:
The use of exercise electrocardiography (ECG) to detect latent coronary heart disease (CHD) is discouraged in apparently healthy populations because of low sensitivity. These recommendations, however, are based on the efficacy of evaluating ischemia (ST segment changes), with little regard for other measures of cardiac function that are available during exertion. The purpose of this investigation was to determine the association of maximal exercise hemodynamic responses with risk of mortality due to all causes, cardiovascular disease (CVD), and coronary heart disease (CHD) in apparently healthy individuals. Study participants were 20,387 men (mean age = 42.2 years) and 6,234 women (mean age = 41.9 years), patients of a preventive medicine center in Dallas, TX, examined between 1971 and 1989. During an average of 8.1 years of follow-up, there were 348 deaths in men and 66 deaths in women. In men, age-adjusted all-cause death rates (per 10,000 person-years) across quartiles of maximal systolic blood pressure (SBP) (low to high) were 18.2, 16.2, 23.8, and 24.6 (p for trend <0.001). Corresponding rates for maximal heart rate were 28.9, 15.9, 18.4, and 15.1 (p for trend <0.001). After adjustment for confounding variables including age, resting systolic pressure, serum cholesterol and glucose, body mass index, smoking status, physical fitness, and family history of CVD, risks (with 95% confidence intervals (CI)) of all-cause mortality for quartiles of maximal SBP, relative to the lowest quartile, were 0.96 (0.70-1.33), 1.36 (1.01-1.85), and 1.37 (0.98-1.92) for quartiles 2-4, respectively. Corresponding risks for maximal heart rate were 0.61 (0.44-0.85), 0.69 (0.51-0.93), and 0.60 (0.41-0.87). No association was noted between maximal exercise rate-pressure product and mortality. Similar results were seen for risk of CVD and CHD death. In women, similar trends in age-adjusted all-cause and CVD death rates across maximal SBP and heart rate categories were observed.
Sensitivity of the exercise test in predicting mortality was enhanced when ECG results were evaluated together with maximal exercise SBP or heart rate, with a concomitant decrease in specificity; positive predictive values were not improved. Thus the efficacy of the exercise test in predicting mortality in apparently healthy men and women was not enhanced by using maximal exercise hemodynamic responses. These results suggest that an exaggerated systolic blood pressure response or an attenuated heart rate response to maximal exercise is a risk factor for mortality in apparently healthy individuals.
Abstract:
The purpose of this study was to evaluate the adequacy of computerized vital records in Texas for conducting etiologic studies on neural tube defects (NTDs), using the revised and expanded National Center for Health Statistics vital record forms introduced in Texas in 1989. Cases of NTDs (anencephaly and spina bifida) among Harris County (Houston) residents were identified from the computerized birth and death records for 1989-1991. The validity of the system was then measured against cases ascertained independently through medical records and death certificates. The computerized system performed poorly in its identification of NTDs, particularly for anencephaly, where the false positive rate was 80%, with little or no improvement over the 3-year period. For both NTDs, the sensitivity and positive predictive value of the tapes were somewhat higher for Hispanic than for non-Hispanic mothers. Case-control studies were conducted utilizing both the tape data set and the independently verified data set, using controls selected from the live birth tapes. Findings varied widely between the data sets. For example, the anencephaly odds ratio for Hispanic mothers (vs. non-Hispanic) was 1.91 (CI = 1.38-2.65) for the tape file, but 3.18 (CI = 1.81-5.58) for verified records. The odds ratio for diabetes was elevated for the tape set (OR = 3.33, CI = 1.67-6.66) but not for verified cases (OR = 1.09, CI = 0.24-4.96), among whom few mothers were diabetic. It was concluded that computerized tapes should not be relied on exclusively for NTD studies. Using the verified cases, Hispanic maternal ethnicity was associated with spina bifida; Hispanic ethnicity, teen motherhood, and previous pregnancy terminations were associated with anencephaly. Mother's birthplace, education, parity, and diabetes were not significant for either NTD. Stratified analyses revealed several notable examples of statistical interaction.
For anencephaly, strong interaction was observed between Hispanic origin and trimester of first prenatal care. The prevalence was 3.8 per 10,000 live births for anencephaly and 2.0 for spina bifida (5.8 per 10,000 births for the two categories combined).
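The validity measures used above reduce to simple counts. A sketch with hypothetical numbers (chosen only so the false positive fraction among tape-identified cases matches the 80% cited above) shows how sensitivity and positive predictive value relate:

```python
def screening_metrics(tp, fp, fn):
    """Validity of computerized ascertainment against verified cases:
    sensitivity = fraction of verified cases the tapes found,
    PPV = fraction of tape-identified cases that were verified."""
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    return sensitivity, ppv

# Hypothetical counts: 100 tape-identified cases, of which 80 are false positives
sens, ppv = screening_metrics(tp=20, fp=80, fn=5)
print(round(sens, 2), round(ppv, 2))  # 0.8 0.2
```

An 80% false positive rate among identified cases is equivalent to a PPV of 20%: four out of five tape-flagged anencephaly cases would not be confirmed on record review, which is why the tape-only odds ratios diverge from the verified ones.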
Abstract:
The natural history of placebo-treated travelers' diarrhea and the prognostic factors of recovery from diarrhea were evaluated using nine groups of placebo-treated subjects from nine clinical trials conducted since 1975, for use as historical controls in future clinical trials of antidiarrheal agents. All of these studies were done by the same group of investigators at one site (Guadalajara, Mexico). The studies are similar in terms of population, measured parameters, microbiologic identification of enteropathogens, and definitions of parameters. The studies had two different durations of follow-up: in some, subjects were followed for two days, and in others for five days. Using definitions established by the Infectious Diseases Society of America and the Food and Drug Administration, the following efficacy parameters were evaluated: time to last unformed stool (TLUS), number of unformed stools post-initiation of placebo treatment for five consecutive days of follow-up, microbiologic cure, and improvement of diarrhea. Among the groups that were followed for five days, the mean TLUS ranged from 59.1 to 83.5 hours. Fifty percent to 78% had diarrhea lasting more than 48 hours, and 25% had diarrhea for more than five days. The mean number of unformed stools passed on the first day post-initiation of therapy ranged from 3.6 to 5.8, and on the fifth day from 0.5 to 1.5. By the end of follow-up, diarrhea had improved in 82.6% to 90% of the subjects. Subjects with enterotoxigenic E. coli had microbiologic cure rates of 21.6% to 90.0%, and subjects with Shigella species experienced microbiologic cure rates of 14.3% to 60.0%. To evaluate the prognostic factors of recovery from diarrhea (the primary efficacy parameter in evaluating the efficacy of antidiarrheal agents against travelers' diarrhea), the subjects from five studies were pooled and the Cox proportional hazards model was used to evaluate predictors of prolonged diarrhea.
After adjusting for the design characteristics of each trial, fever (rate ratio (RR) 0.40), presence of invasive pathogens (RR 0.41), presence of severe abdominal pain and cramps (RR 0.50), more than five watery stools (RR 0.60), and presence of non-invasive pathogens (RR 0.84) predicted a longer duration of diarrhea, while severe vomiting (RR 2.53) predicted a shorter duration. The number of soft stools, presence of fecal leukocytes, presence of nausea, and duration of diarrhea before enrollment were not associated with duration of diarrhea.
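To see why rate ratios below 1 correspond to longer diarrhea: the Cox model here is for the recovery hazard, so a covariate that lowers the recovery rate stretches the time to last unformed stool. Assuming a hypothetical exponential baseline (not fitted to these data), the arithmetic is:

```python
def mean_duration(baseline_rate, rate_ratio):
    """Mean time to recovery under an exponential model: 1 / (lambda0 * RR).
    A recovery-rate ratio below 1 slows recovery and prolongs diarrhea."""
    return 1.0 / (baseline_rate * rate_ratio)

# Hypothetical baseline recovery rate of 0.02 per hour (50 h mean duration)
print(mean_duration(0.02, 1.0))   # 50.0 hours at baseline
print(mean_duration(0.02, 0.40))  # 125.0 hours with fever (RR 0.40 above)
```

Under this illustrative exponential assumption, a rate ratio of 0.40 multiplies the mean duration by 2.5; the actual Cox model makes no such distributional assumption, but the direction of the effect is the same.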
Abstract:
Pneumonia is a well-documented and common respiratory infection in patients with acute traumatic spinal cord injuries, and it may recur during the course of acute care. Using data from the North American Clinical Trials Network (NACTN) for Spinal Cord Injury, the incidence, timing, and recurrence of pneumonia were analyzed. The two main objectives were (1) to investigate the time to, and potential risk factors for, the first occurrence of pneumonia using the Cox proportional hazards model, and (2) to investigate pneumonia recurrence and its risk factors using a counting process model that generalizes the Cox proportional hazards model. The results of the survival analysis suggested that surgery, intubation, American Spinal Injury Association (ASIA) grade, direct admission to a NACTN site, and age (older than 65 years or not) were significant risk factors for both the first event of pneumonia and multiple events of pneumonia. The significance of this research is its potential to identify, at the time of admission, patients who are at high risk for the incidence and recurrence of pneumonia. Knowledge of the likelihood and timing of pneumonia occurrence is important for the development of prevention strategies and may also provide insights into the selection of emerging therapies that compromise the immune system.