818 results for clinical (human) or epidemiologic studies: risk factor assessment
Abstract:
Background: Arboviral diseases are major global public health threats. Yet our understanding of infection risk factors is, with a few exceptions, considerably limited. A crucial shortcoming is the widespread use of analytical methods generally not suited for observational data, particularly null hypothesis testing (NHT) and stepwise regression (SWR). Using Mayaro virus (MAYV) as a case study, here we compare information theory-based multimodel inference (MMI) with conventional analyses for arboviral infection risk factor assessment. Methodology/Principal Findings: A cross-sectional survey of anti-MAYV antibodies revealed 44% prevalence (n = 270 subjects) in a central Amazon rural settlement. NHT suggested that residents of village-like household clusters and those using closed toilets/latrines were at higher risk, while living in non-village-like areas, using bednets, and owning fowl, pigs or dogs were protective. The "minimum adequate" SWR model retained only residence area and bednet use. Using MMI, we identified relevant covariates, quantified their relative importance, and estimated effect sizes (β ± SE) on which to base inference. Residence area (β_Village = 2.93 ± 0.41; β_Upland = -0.56 ± 0.33; β_Riverbanks = -2.37 ± 0.55) and bednet use (β = -0.95 ± 0.28) were the most important factors, followed by crop-plot ownership (β = 0.39 ± 0.22) and regular use of a closed toilet/latrine (β = 0.19 ± 0.13); domestic animals had nonsignificant protective effects and were relatively unimportant. The SWR model ranked fifth among the 128 models in the final MMI set. Conclusions/Significance: Our analyses illustrate how MMI can enhance inference on infection risk factors compared with NHT or SWR. MMI indicates that forest crop-plot workers are likely exposed to typical MAYV cycles maintained by diurnal, forest-dwelling vectors; however, MAYV might also be circulating in nocturnal, domestic-peridomestic cycles in village-like areas. This suggests either a vector shift (synanthropic mosquitoes vectoring MAYV) or a habitat/habits shift (classical MAYV vectors adapting to densely populated landscapes and nocturnal biting); any such ecological/adaptive novelty could increase the likelihood of MAYV emergence in Amazonia.
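To make the MMI procedure concrete, the sketch below performs AIC-based model averaging over all subsets of a set of candidate predictors of serostatus; with seven candidates this gives 2^7 = 128 models, matching the size of the final MMI set reported above. This is a hedged illustration in the spirit of Burnham and Anderson, not the authors' code, and the data frame and column names are hypothetical.

```python
# Sketch of information-theoretic multimodel inference (MMI) for a binary
# outcome: fit logistic regressions for every subset of candidate covariates,
# convert AICs to Akaike weights, then model-average the coefficients and sum
# weights per covariate to obtain relative variable importance.
import itertools
import numpy as np
import statsmodels.api as sm

def akaike_weights(aics):
    """Turn AIC values into Akaike weights (relative support per model)."""
    delta = np.asarray(aics) - np.min(aics)
    w = np.exp(-0.5 * delta)
    return w / w.sum()

def multimodel_inference(df, outcome, candidates):
    """Return Akaike weights, model-averaged betas and variable importance."""
    fits, subsets = [], []
    for k in range(len(candidates) + 1):
        for subset in itertools.combinations(candidates, k):
            X = df[list(subset)].copy()
            X.insert(0, "const", 1.0)          # intercept column
            fits.append(sm.Logit(df[outcome], X).fit(disp=0))
            subsets.append(set(subset))
    w = akaike_weights([f.aic for f in fits])
    # Zero-substitution variant of model averaging: a coefficient counts as
    # 0 in models that omit the term.
    avg_beta = {c: sum(wi * f.params.get(c, 0.0) for wi, f in zip(w, fits))
                for c in candidates}
    importance = {c: sum(wi for wi, s in zip(w, subsets) if c in s)
                  for c in candidates}
    return w, avg_beta, importance

# Hypothetical usage:
# w, beta, imp = multimodel_inference(df, "seropositive",
#                                     ["village", "bednet", "crop_plot"])
```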
Abstract:
Risk factors for the development of multiple sclerosis (MS) are still a matter of debate. Latitude gradient, vitamin D deficiency and season of birth are among the most investigated environmental factors associated with the disease. Several international studies suggest that birth in spring is a substantial risk factor for MS. We investigated season of birth as a potential risk factor for MS in different geographical regions of Brazil. We conducted a cross-sectional retrospective study of 2257 clinically definite MS patients enrolled at 13 Brazilian MS clinics in the south, southeast, and northeast regions of Brazil. Demographic and clinical data relating to date of birth and clinical features of the disease were collected and analysed, and birth dates were subsequently compared with those of the general Brazilian population. The distribution of dates of birth of MS patients showed an increase in spring and a decrease in autumn, with no difference observed in the other seasons. In conclusion, season of birth is a probable risk factor for MS in most parts of Brazil. These findings may be related to the role that vitamin D plays in MS pathogenesis. © 2013 Elsevier B.V. All rights reserved.
Abstract:
Clinical studies indicate that exaggerated postprandial lipemia is linked to the progression of atherosclerosis, the leading cause of cardiovascular disease (CVD). CVD is a multifactorial disease with a complex etiology, and according to the literature, postprandial triglycerides (TG) can be used as an independent CVD risk factor. The aim of the current study is to construct an Artificial Neural Network (ANN)-based system for the identification of the most important gene-gene and/or gene-environment interactions that contribute to a fast or slow postprandial metabolism of TG in blood, and consequently to investigate the causality of the postprandial TG response. The design and development of the system are based on a dataset of 213 subjects who underwent a two-meal fatty-meal protocol. For each subject, a total of 30 input variables corresponding to genetic variations, sex, age and fasting levels of clinical measurements were known. These variables provide input to the system, which is based on the combined use of a Parameter Decreasing Method (PDM) and an ANN. The system was able to identify the ten most informative variables and achieve a mean accuracy of 85.21%.
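The abstract does not detail the PDM, so the following sketch assumes a common reading: a backward-elimination wrapper that repeatedly retrains the network without each remaining input and drops the variable whose removal costs the least cross-validated accuracy, until the ten most informative inputs remain. scikit-learn's MLPClassifier stands in for the study's ANN; the network size is arbitrary.

```python
# Sketch of a Parameter Decreasing Method (PDM) wrapped around an ANN:
# backward elimination, dropping at each step the input whose removal costs
# the least cross-validated accuracy, until n_keep inputs remain.
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def pdm_select(X, y, n_keep=10, cv=5):
    """Return column indices of the n_keep most informative inputs in X."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:
        trials = []
        for i in remaining:
            cols = [c for c in remaining if c != i]
            ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                random_state=0)
            acc = cross_val_score(ann, X[:, cols], y, cv=cv).mean()
            trials.append((acc, i))
        # The variable whose removal leaves accuracy highest is the least
        # informative one; drop it.
        _, drop = max(trials)
        remaining.remove(drop)
    return remaining
```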
Abstract:
Genome-wide association studies (GWAS) have revealed genetic determinants of iron metabolism, but correlation of these with clinical phenotypes is pending. Homozygosity for HFE C282Y is the predominant genetic risk factor for hereditary hemochromatosis (HH) and may cause liver cirrhosis. However, this genotype has a low penetrance. Thus, detection of yet unknown genetic markers that identify patients at risk of developing severe liver disease is necessary for better prevention. Genetic loci associated with iron metabolism (TF, TMPRSS6, PCSK7, TFR2 and Chr2p14) in recent GWAS and with liver fibrosis (PNPLA3) in a recent meta-analysis were analyzed for association with either liver cirrhosis or advanced fibrosis in 148 German HFE C282Y homozygotes. Replication of associations was sought in an additional 499 Austrian/Swiss and 112 Swedish HFE C282Y homozygotes. Only variant rs236918 in the PCSK7 gene (proprotein convertase subtilisin/kexin type 7) was associated with cirrhosis or advanced fibrosis (P = 1.02 × 10^-5) in the German cohort, with genotypic odds ratios of 3.56 (95% CI 1.29-9.77) for CG heterozygotes and 5.38 (95% CI 2.39-12.10) for C allele carriers. The association between rs236918 and cirrhosis was confirmed in Austrian/Swiss HFE C282Y homozygotes (P = 0.014; OR_allelic = 1.82, 95% CI 1.12-2.95) but not in Swedish patients. Post hoc combined analyses of German/Swiss/Austrian patients with available liver histology (N = 244, P = 0.00014, OR_allelic = 2.84) and of males only (N = 431, P = 2.17 × 10^-5, OR_allelic = 2.54) were consistent with the primary finding. The association between rs236918 and cirrhosis was not confirmed in alcoholic cirrhotics, suggesting specificity of this genetic risk factor for HH. PCSK7 variant rs236918 is a risk factor for cirrhosis in HH patients homozygous for the HFE C282Y mutation.
Abstract:
There is growing concern within public health about mycotoxin involvement in human diseases, namely those affecting children. The MycoMix project (2012-2015), funded by the Portuguese Foundation for Science and Technology, gathered a multidisciplinary team aiming to answer several questions: 1) Are Portuguese children exposed daily to one or several mycotoxins through food? 2) Can this co-exposure affect children's health? 3) Are there interaction effects between mycotoxins? MycoMix results revealed that Portuguese children (< 3 years old, n = 103) are exposed to multiple mycotoxins through food consumption. Cumulative risk assessment results revealed a potential health concern at the high percentiles of intake, especially for aflatoxins, which are carcinogenic compounds. This fact assumes particular importance considering the interactive effects found in in vitro bioassays. These results highlight the need for a more accurate approach to assess human exposure to mycotoxins. Within the MycoMix project, the assessment of mycotoxin exposure was based on calculations combining mycotoxin data in food with population data on food consumption. This approach does not consider some aspects such as inter-individual variation in metabolism, exposure through sources other than food, and the heterogeneous distribution of mycotoxins in food. Exposure assessment of mycotoxins in the Portuguese population through biomarkers is still missing, and further studies urgently need to be developed. The European Human Biomonitoring Initiative (EHBMI), a proposal within the European Joint Programme, aims to advance the understanding of the extent of exposure to environmental chemicals across Europe and their impact on human health by gathering national expertise in the human biomonitoring domain. At the national level, the MycoMix project uncovered the potential health risk of exposure of Portuguese children to multiple mycotoxins. The risk assessment expertise acquired within MycoMix, namely in the analysis and toxicology of chemical mixtures, will be brought together as a contribution to EHBMI objectives.
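The deterministic exposure step described above (combining mycotoxin concentrations in food with consumption data) can be illustrated with a minimal sketch; all numbers below are invented for illustration, not MycoMix estimates.

```python
# Sketch of a deterministic dietary exposure estimate: combine a mycotoxin
# concentration in a food with consumption data to get an estimated daily
# intake (EDI), then compare it with a health-based guidance value.
def estimated_daily_intake(conc_ug_per_kg_food, consumption_g_per_day,
                           body_weight_kg):
    """EDI in micrograms per kg body weight per day."""
    return conc_ug_per_kg_food * (consumption_g_per_day / 1000.0) / body_weight_kg

edi = estimated_daily_intake(conc_ug_per_kg_food=0.5,   # e.g. a cereal product
                             consumption_g_per_day=40.0,
                             body_weight_kg=12.0)        # a young child
tdi = 1.0                     # illustrative tolerable daily intake (ug/kg bw/day)
hazard_quotient = edi / tdi   # > 1 would flag a potential health concern
```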
Abstract:
The growing knowledge of the genetic polymorphisms of enzymes metabolising xenobiotics in humans, and of their connections with individual susceptibility to toxicants, has created new and important interfaces between human epidemiology and experimental toxicology. The results of molecular epidemiological studies may provide new hypotheses and concepts that call for experimental verification, and experimental concepts may obtain further proof through molecular epidemiological studies. If applied diligently, these possibilities may be combined to lead to new strategies of human-oriented toxicological research. This overview presents some outstanding examples of such strategies taken from the practically very important field of occupational toxicology. The main focus is placed on the effects of enzyme polymorphisms of xenobiotic metabolism in association with the induction of bladder cancer and renal cell cancer after exposure to occupational chemicals. Smoking and the induction of head and neck squamous cell cancer are also considered.
Abstract:
Background Burden of disease estimates for South Africa have highlighted the particularly high rates of injuries related to interpersonal violence compared with other regions of the world, but these figures tell only part of the story. In addition to direct physical injury, violence survivors are at an increased risk of a wide range of psychological and behavioral problems. This study aimed to comprehensively quantify the excess disease burden attributable to exposure to interpersonal violence as a risk factor for disease and injury in South Africa. Methods The World Health Organization framework of interpersonal violence was adapted. Physical injury mortality and disability were categorically attributed to interpersonal violence. In addition, exposure to child sexual abuse and intimate partner violence, subcategories of interpersonal violence, were treated as risk factors for disease and injury using counterfactual estimation and comparative risk assessment methods. Adjustments were made to account for the combined exposure state of having experienced both child sexual abuse and intimate partner violence. Results Of the 17 risk factors included in the South African Comparative Risk Assessment study, interpersonal violence was the second leading cause of healthy years of life lost, after unsafe sex, accounting for 1.7 million disability-adjusted life years (DALYs) or 10.5% of all DALYs (95% uncertainty interval: 8.5%-12.5%) in 2000. In women, intimate partner violence accounted for 50% and child sexual abuse for 32% of the total attributable DALYs. Conclusions The implications of our findings are that estimates that include only the direct injury burden seriously underrepresent the full health impact of interpersonal violence. Violence is an important direct and indirect cause of health loss and should be recognized as a priority health problem as well as a human rights and social issue. This study highlights the difficulties in measuring the disease burden from interpersonal violence as a risk factor and the need to improve the epidemiological data on the prevalence and risks for the different forms of interpersonal violence to complete the picture. Given the extent of the burden, it is essential that innovative research be supported to identify social policy and other interventions that address both the individual and societal aspects of violence.
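As an illustration of the comparative risk assessment step described above, the sketch below computes the standard population attributable fraction (PAF) for a multi-level exposure and applies it to a burden total; the prevalences, relative risks and DALY total are invented for illustration, not the study's estimates.

```python
# Sketch of one comparative risk assessment step: compute the population
# attributable fraction (PAF) from exposure prevalences and relative risks,
# then attribute a share of the total burden (in DALYs).
def paf(prevalences, relative_risks):
    """PAF = sum(p_i*(RR_i-1)) / (sum(p_i*(RR_i-1)) + 1)."""
    excess = sum(p * (rr - 1.0) for p, rr in zip(prevalences, relative_risks))
    return excess / (excess + 1.0)

# Illustrative exposure categories: none, child sexual abuse (CSA),
# intimate partner violence (IPV), and the combined CSA+IPV state.
p = [0.70, 0.10, 0.15, 0.05]   # made-up prevalences (sum to 1)
rr = [1.0, 1.8, 2.2, 3.0]      # made-up relative risks vs. unexposed
attributable_dalys = paf(p, rr) * 1_000_000  # PAF x total DALYs (invented)
```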
Abstract:
Several hypnosis monitoring systems based on the processed electroencephalogram (EEG) have been developed for use during general anesthesia. The assessment of the analgesic component (antinociception) of general anesthesia is an emerging field of research. This study investigated the interaction of hypnosis and antinociception, the association of several physiological variables with the degree of intraoperative nociception, and aspects of EEG Bispectral Index Scale (BIS) monitoring during general anesthesia. In addition, EEG features and heart rate (HR) responses during desflurane and sevoflurane anesthesia were compared. A propofol bolus of 0.7 mg/kg was more effective than an alfentanil bolus of 0.5 mg in preventing the recurrence of movement responses during uterine dilatation and curettage (D&C) after a propofol-alfentanil induction, combined with nitrous oxide (N2O). HR and several HR variability-, frontal electromyography (fEMG)-, pulse plethysmography (PPG)-, and EEG-derived variables were associated with surgery-induced movement responses. Movers were discriminated from non-movers mostly by the post-stimulus values per se or normalized with respect to the pre-stimulus values. In logistic regression analysis, the best classification performance was achieved with the combination of normalized fEMG power and HR during D&C (overall accuracy 81%, sensitivity 53%, specificity 95%), and with the combination of normalized fEMG-related response entropy, electrocardiography (ECG) R-to-R interval (RRI), and PPG dicrotic notch amplitude during sevoflurane anesthesia (overall accuracy 96%, sensitivity 90%, specificity 100%). ECG electrode impedances after alcohol swab skin pretreatment alone were higher than impedances of designated EEG electrodes. The BIS values registered with ECG electrodes were higher than those registered simultaneously with EEG electrodes. No significant difference in the time to home-readiness after isoflurane-N2O or sevoflurane-N2O anesthesia was found when the administration of the volatile agent was guided by BIS monitoring. All other early and intermediate recovery parameters were also similar. Transient epileptiform EEG activity was detected in eight of 15 sevoflurane patients during a rapid increase in the inspired volatile concentration, and in none of the 16 desflurane patients. The observed transient EEG changes did not adversely affect the recovery of the patients. Following the rapid increase in the inhaled desflurane concentration, HR increased transiently, reaching its maximum in two minutes. In the sevoflurane group, the increase was slower and more subtle. In conclusion, desflurane may be a safer volatile agent than sevoflurane in patients with a lowered seizure threshold. The tachycardia induced by a rapid increase in the inspired desflurane concentration may present a risk for patients with heart disease. Designated EEG electrodes may be superior to ECG electrodes in EEG BIS monitoring. When the administration of isoflurane or sevoflurane is adjusted to maintain BIS values at 50-60 in healthy ambulatory surgery patients, the speed and quality of recovery are similar after both isoflurane-N2O and sevoflurane-N2O anesthesia. When anesthesia is maintained by the inhalation of N2O and bolus doses of propofol and alfentanil in healthy unparalyzed patients, movement responses may be best avoided by ensuring a relatively deep hypnotic level with propofol.
HR/RRI, fEMG, and PPG dicrotic notch amplitude are potential indicators of nociception during anesthesia, but their performance needs to be validated in future studies. Combining information from different sources may improve the discrimination of the level of nociception.
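As an illustration of the logistic-regression classification step reported above, this sketch fits a model on two placeholder predictors (normalized fEMG power and HR) and reports overall accuracy, sensitivity and specificity from the confusion matrix; the synthetic data stand in for the study's recordings.

```python
# Sketch of mover vs. non-mover discrimination with logistic regression,
# scored by accuracy, sensitivity and specificity. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                # [normalized fEMG power, HR]
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=100) > 0

clf = LogisticRegression().fit(X, y)
tn, fp, fn, tp = confusion_matrix(y, clf.predict(X)).ravel()
print(f"accuracy={(tp + tn) / len(y):.2f}  "
      f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")
```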
Abstract:
BACKGROUND: Published work assessing psychosocial stress (job strain) as a risk factor for coronary heart disease is inconsistent and subject to publication bias and reverse causation bias. We analysed the relation between job strain and coronary heart disease with a meta-analysis of published and unpublished studies. METHODS: We used individual records from 13 European cohort studies (1985-2006) of men and women without coronary heart disease who were employed at the time of baseline assessment. We measured job strain with questions from validated job-content and demand-control questionnaires. We extracted data in two stages such that acquisition and harmonisation of the job strain measure and covariables occurred before linkage to records for coronary heart disease. We defined incident coronary heart disease as the first non-fatal myocardial infarction or coronary death. FINDINGS: 30,214 (15%) of 197,473 participants reported job strain. In 1.49 million person-years at risk (mean follow-up 7.5 years [SD 1.7]), we recorded 2358 events of incident coronary heart disease. After adjustment for sex and age, the hazard ratio for job strain versus no job strain was 1.23 (95% CI 1.10-1.37). This effect estimate was higher in published (1.43, 1.15-1.77) than unpublished (1.16, 1.02-1.32) studies. Hazard ratios were likewise raised in analyses addressing reverse causality by exclusion of events of coronary heart disease that occurred in the first 3 years (1.31, 1.15-1.48) and 5 years (1.30, 1.13-1.50) of follow-up. We noted an association between job strain and coronary heart disease across sexes, age groups, socioeconomic strata, and regions, and after adjustment for socioeconomic status, lifestyle, and conventional risk factors. The population attributable risk for job strain was 3.4%. INTERPRETATION: Our findings suggest that prevention of workplace stress might decrease disease incidence; however, this strategy would have a much smaller effect than would tackling of standard risk factors, such as smoking. FUNDING: Finnish Work Environment Fund, the Academy of Finland, the Swedish Research Council for Working Life and Social Research, the German Social Accident Insurance, the Danish National Research Centre for the Working Environment, the BUPA Foundation, the Ministry of Social Affairs and Employment, the Medical Research Council, the Wellcome Trust, and the US National Institutes of Health.
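As a consistency check on the reported figure, the population attributable risk follows from the standard formula; with job-strain prevalence P_e ≈ 0.15 and the age- and sex-adjusted hazard ratio 1.23 (treating the hazard ratio as an approximation to the relative risk):

```latex
\mathrm{PAR} = \frac{P_e(\mathrm{HR}-1)}{1 + P_e(\mathrm{HR}-1)}
             = \frac{0.15 \times 0.23}{1 + 0.15 \times 0.23}
             \approx 0.033
```

That is roughly 3.3%, in line with the published 3.4%; the small gap presumably reflects rounding of the prevalence and hazard ratio.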
Abstract:
Background: Serious case reviews and research studies have indicated weaknesses in risk assessments conducted by child protection social workers. Social workers are adept at gathering information but struggle with analysis and assessment of risk. The Department for Education wants to know if the use of a structured decision-making tool can improve child protection assessments of risk.
Methods/design: This multi-site, cluster-randomised trial will assess the effectiveness of the Safeguarding Children Assessment and Analysis Framework (SAAF). This structured decision-making tool aims to improve social workers' assessments of harm, of future risk and parents' capacity to change. The comparison is management as usual.
Inclusion criteria: Children's Services Departments (CSDs) in England willing to make relevant teams available to be randomised, and willing to meet the trial's training and data collection requirements.
Exclusion criteria: CSDs where there were concerns about performance; where a major organisational restructuring was planned or under way; or where other risk assessment tools were in use.
Six CSDs are participating in this study. Social workers in the experimental arm will receive two days of training in SAAF, together with a range of support materials and access to limited telephone consultation post-training. The primary outcome is child maltreatment. This will be assessed using data collected nationally on two key performance indicators: the first is the number of children in a year who have been subject to a second Child Protection Plan (CPP); the second is the number of re-referrals of children because of related concerns about maltreatment. Secondary outcomes are: i) the quality of assessments judged against a schedule of quality criteria and ii) the relationship between the three assessments required by the structured decision-making tool (level of harm, risk of (re)abuse and prospects for successful intervention).
Discussion: This is the first study to examine the effectiveness of SAAF. It will add to a very limited literature on the contribution that structured decision-making tools can make to improving risk assessment and case planning in child protection, and on what is involved in their effective implementation.
Treatment intensification and risk factor control: toward more clinically relevant quality measures.
Abstract:
BACKGROUND: Intensification of pharmacotherapy in persons with poorly controlled chronic conditions has been proposed as a clinically meaningful process measure of quality. OBJECTIVE: To validate measures of treatment intensification by evaluating their associations with subsequent control in hypertension, hyperlipidemia, and diabetes mellitus across 35 medical facility populations in Kaiser Permanente, Northern California. DESIGN: Hierarchical analyses of associations of improvements in facility-level treatment intensification rates from 2001 to 2003 with patient-level risk factor levels at the end of 2003. PATIENTS: Members (515,072 and 626,130; age > 20 years) with hypertension, hyperlipidemia, and/or diabetes mellitus in 2001 and 2003, respectively. MEASUREMENTS: Treatment intensification for each risk factor, defined as an increase in the number of drug classes prescribed, an increase in the dosage of at least 1 drug, or a switch to a drug from another class within 3 months of observed poor risk factor control. RESULTS: Facility-level improvements in treatment intensification rates between 2001 and 2003 were strongly associated with a greater likelihood of being in control at the end of 2003 (P ≤ 0.05 for each risk factor) after adjustment for patient- and facility-level covariates. Compared with facility rankings based solely on control, addition of percentages of poorly controlled patients who received treatment intensification changed 2003 rankings substantially: 14%, 51%, and 29% of the facilities changed ranks by 5 or more positions for hypertension, hyperlipidemia, and diabetes, respectively. CONCLUSIONS: Treatment intensification is tightly linked to improved control. Thus, it deserves consideration as a process measure for motivating quality improvement and possibly for measuring clinical performance.
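The intensification measure lends itself to a simple operational check. The sketch below flags intensification when, within three months of a poor-control reading, the regimen gains a drug class, increases a dose, or switches classes; the record layout (dicts mapping drug class to daily dose) is invented for illustration.

```python
# Sketch of the treatment-intensification flag: within 90 days of a
# poor-control reading, did the regimen gain a drug class, increase a dose,
# or switch to a drug from another class?
from datetime import timedelta

def intensified(poor_control_date, regimen_before, regimens_after):
    """regimens_after: dict mapping observation date -> regimen dict."""
    window_end = poor_control_date + timedelta(days=90)
    for date, regimen in sorted(regimens_after.items()):
        if date > window_end:
            break
        added = set(regimen) - set(regimen_before)
        removed = set(regimen_before) - set(regimen)
        dose_up = any(regimen.get(cls, 0.0) > dose
                      for cls, dose in regimen_before.items())
        more_classes = len(regimen) > len(regimen_before)
        switched = bool(added and removed)   # class swapped for another
        if more_classes or dose_up or switched:
            return True
    return False
```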
Abstract:
Background. This cross-sectional study was designed to evaluate the role of cigarette smoking and high-risk HPV types as risk factors for CIN 2 and 3 in young, sexually active Brazilian women. Materials and methods. A series of 100 consecutive women with abnormal Pap smears were recruited and subjected to colposcopy, punch biopsy, and a questionnaire on their social, sexual and reproductive factors. Of these, 77 women between 20 and 35 years of age (median 26.5 years) with biopsy-confirmed CIN 1 or CIN 2 and 3 were enrolled in this study. Representative samples from the exocervix and endocervix were obtained for HPV testing with the Hybrid Capture HPV-DNA assay, including the probes for the oncogenic HPV types (16, 18, 31, 33, 35, 45, 51, 52 and 56). Results. The overall rate of CIN 2 and 3 was 23/77 (29.8%). The women with CIN 1, 2 and 3 did not differ from each other with regard to their age, race, schooling, marital status, lifetime number of sexual partners, age at first intercourse, use of oral contraceptives, or parity. However, current cigarette smoking was strongly associated with CIN 2 and 3 (p < 0.001), and among smokers, the risk of high-grade CIN increased in parallel with the duration of exposure (years of smoking) (p = 0.07). HPV DNA of the oncogenic types was detected in 43 (56%) women, and the risk of being HPV DNA-positive was significantly higher in CIN 2 and 3 than in CIN 1 (p = 0.037). Importantly, the prevalence of high-risk HPV types was significantly higher in cigarette smokers than in non-smokers (p = 0.046). Conclusions. The results indicate that the severity of CIN lesions was clearly related to two fundamental risk factors: 1) high-risk HPV types, and 2) current cigarette smoking. These two risk factors were closely interrelated in that the high-risk HPV types were significantly more frequent in current smokers than in non-smokers, suggesting the possibility of a synergistic action between these two risk factors in cervical carcinogenesis.
Abstract:
Background Post-transplant anemia is multifactorial and highly prevalent. Some studies have associated anemia with mortality and graft failure. The purpose of this study was to assess whether the presence of anemia at 1 year is an independent risk factor for mortality and graft loss. Methods All patients transplanted at a single center who survived at least 1 year after transplantation with no graft loss (n = 214) were included. Demographic and clinical data were collected at baseline and at 1 year. Patients were divided into two groups (anemic and nonanemic) based on the presence of anemia (hemoglobin < 130 g/l in men and < 120 g/l in women). Results Baseline characteristics such as age, gender, type of donor, CKD etiology, rejection, and mismatches were similar in both groups. Creatinine clearance was similar in the anemic and nonanemic groups (69.32 ± 29.8 vs. 75.69 ± 30.5 ml/min; P = 0.17). A Kaplan-Meier plot showed significantly poorer death-censored graft survival in the anemic group (P = 0.003). Multivariate analysis revealed that anemic patients had a hazard ratio for graft loss of 3.85 (95% CI: 1.49-9.96; P = 0.005). Conclusions In this study, anemia at 1 year was independently associated with death-censored graft survival, and anemic patients were 3.8-fold more likely to lose the graft. © 2010 Springer Science+Business Media, B.V.
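A minimal sketch of the survival analysis described above, using the lifelines package; the file and column names are hypothetical, and the code is illustrative rather than the study's analysis.

```python
# Sketch of Kaplan-Meier curves by anemia status and a multivariable Cox
# model for death-censored graft loss, using lifelines.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.read_csv("transplant_cohort.csv")    # hypothetical dataset

km = KaplanMeierFitter()
for anemic, group in df.groupby("anemia_at_1yr"):
    km.fit(group["years_to_graft_loss"], group["graft_lost"],
           label=f"anemic={anemic}")
    km.plot_survival_function()

cox = CoxPHFitter()
cox.fit(df[["years_to_graft_loss", "graft_lost", "anemia_at_1yr",
            "age", "creatinine_clearance"]],
        duration_col="years_to_graft_loss", event_col="graft_lost")
cox.print_summary()   # anemia hazard ratio, cf. the reported 3.85
```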
Abstract:
Microalbuminuria is an established risk factor for renal disease, especially in the diabetic population. Recent studies have shown that microalbuminuria also has a highly relevant predictive value for cardiovascular morbidity and mortality. From normal levels to overt proteinuria, albuminuria shows a continuous, marked increase in cardiovascular risk. This association is independent of other "classical" cardiovascular risk factors such as hypertension, hyperlipidemia or smoking. Furthermore, it has predictive value not only for patients with diabetic or renal disease, but also for hypertensive individuals and the general population. Angiotensin-converting enzyme inhibitors and angiotensin receptor blockers have been shown to display not only reno- but also cardioprotective effects. Their unique ability to lower albuminuria by 40% is related to a significant risk reduction in cardiovascular mortality. New clinical trials are needed to define "normal" albuminuria levels and how low we should go.