68 results for High Risk
Abstract:
The purpose of this study was to investigate the occupational hazards within the tanning industry caused by contaminated dust. A qualitative assessment of the risk of human exposure to dust was made throughout a commercial Kenyan tannery. Using this information, high-risk points in the processing line were identified and dust sampling regimes developed. An optical set-up using microscopy and digital imaging techniques was used to determine dust particle numbers and size distributions. The results showed that chemical handling was the most hazardous operation (12 mg m(-3)). A Monte Carlo method was used to estimate the concentration of the dust in the air throughout the tannery during an 8 h working day. This showed that the high-risk area of the tannery was associated with mean dust concentrations greater than the limits stipulated in UK Statutory Instrument 2002 No. 2677, exceeding both the inhalable dust limit of 10 mg m(-3) and the respirable dust limit of 4 mg m(-3). This has implications for the provision of personal protective equipment (PPE) to tannery workers for the mitigation of occupational risk.
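The Monte Carlo idea described above can be sketched as follows: repeatedly draw dust concentrations for the sampled intervals of an 8 h shift and average them to estimate mean exposure, then compare that estimate against the statutory limit. The log-normal distribution and its parameters below are illustrative assumptions, not the study's fitted values.

```python
import random

def simulate_shift_mean(mu=2.0, sigma=0.5, intervals=8, trials=10_000, seed=42):
    """Estimate the mean shift-long dust concentration (mg m^-3).

    Each trial draws one concentration per hourly interval from a
    log-normal distribution (hypothetical parameters) and averages them;
    the function returns the mean over all trials.
    """
    rng = random.Random(seed)
    means = []
    for _ in range(trials):
        samples = [rng.lognormvariate(mu, sigma) for _ in range(intervals)]
        means.append(sum(samples) / intervals)
    return sum(means) / trials

INHALABLE_LIMIT = 10.0  # mg m^-3, the UK inhalable dust limit cited above

mean_exposure = simulate_shift_mean()
print(f"Estimated mean shift concentration: {mean_exposure:.1f} mg m^-3")
print("Exceeds inhalable limit:", mean_exposure > INHALABLE_LIMIT)
```

In a real assessment the distribution parameters would be fitted to the sampled particle counts at each processing point rather than assumed.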
Abstract:
Predictive validity of the Stanford-Binet Intelligence Scale Fourth Edition (S-B IV) from age 3 years to ages 4-5 years was evaluated with biologically "at risk" children without major sensory or motor impairments (n = 236). Using the standard scoring, children with full-scale IQ ≤ 84 on the Wechsler Preschool and Primary Scale of Intelligence at age 4-5 years were poorly identified (sensitivity 54%) from the composite S-B IV score at age 3. However, sensitivity improved greatly to 78% by including as a predictor the number of subtests the child was actually able to perform at age 3 years. Measures from the Home Screening Questionnaire and ratings of mother-child interaction further improved sensitivity to 83%. The standard method for calculating the composite score on the S-B IV excludes subtests with a raw score of 0, which overestimates cognitive functioning in young biologically high risk children. Accuracy of early identification was improved significantly by considering the number of subtests the child did not perform at age 3 years.
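The sensitivity figures above come from a standard screening calculation: the proportion of truly impaired children flagged by the predictor. A minimal sketch, with hypothetical 2x2-table counts chosen only to reproduce the reported percentages (the study's actual counts are not given in the abstract):

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Sensitivity = true positives / (true positives + false negatives)."""
    return true_pos / (true_pos + false_neg)

# Composite S-B IV score alone (abstract reports 54% sensitivity):
print(f"{sensitivity(27, 23):.0%}")  # 54%

# Adding the number of subtests performed (abstract reports 78%):
print(f"{sensitivity(39, 11):.0%}")  # 78%
```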
Abstract:
Recent advances in corneal graft technology, including donor tissue retrieval, storage and surgical techniques, have greatly improved the clinical outcome of corneal grafts. Despite these advances, immune mediated corneal graft rejection remains the single most important cause of corneal graft failure. Several host factors have been identified as conferring a "high risk" status to the host. These include: more than two quadrant vascularisation, with associated lymphatics, which augment the afferent and efferent arc of the immune response; herpes simplex keratitis; uveitis; silicone oil keratopathy; previous failed (rejected) grafts; "hot eyes"; young recipient age; and multiple surgical procedures at the time of grafting. Large grafts, by virtue of being closer to the host limbus, with its complement of vessels and antigen-presenting Langerhans cells, are also more susceptible to rejection. The diagnosis of graft rejection is entirely clinical, and in its early stages the clinical signs can be subtle. Graft rejection is largely mediated by the major histocompatibility antigens, minor antigens and perhaps blood group ABO antigens and some cornea-specific antigens. Just as rejection is mediated by active immune mediated events, the lack of rejection (tolerance) is also sustained by active immune regulatory mechanisms. The anterior chamber associated immune deviation (ACAID) and, probably, conjunctiva associated lymphoid tissue (CALT) induced mucosal tolerance, among other mechanisms, play an important role. Although graft rejection can lead to graft failure, most rejections can be readily controlled if appropriate management is commenced at the proper time. Topical steroids are the mainstay of graft rejection management. In high-risk situations, however, systemic steroids and other immunosuppressive drugs such as cyclosporin and tacrolimus (FK506) are of proven benefit, both for treatment and prevention of rejection.
Abstract:
Introduction: It has been suggested that doctors in their first year of post-graduate training make a disproportionate number of prescribing errors.
Objective: This study aimed to compare the prevalence of prescribing errors made by first-year post-graduate doctors with that of errors by senior doctors and non-medical prescribers and to investigate the predictors of potentially serious prescribing errors.
Methods: Pharmacists in 20 hospitals over 7 prospectively selected days collected data on the number of medication orders checked, the grade of prescriber and details of any prescribing errors. Logistic regression models (adjusted for clustering by hospital) identified factors predicting the likelihood of prescribing erroneously and the severity of prescribing errors.
Results: Pharmacists reviewed 26,019 patients and 124,260 medication orders; 11,235 prescribing errors were detected in 10,986 orders. The mean error rate was 8.8 errors per 100 medication orders (95% confidence interval [CI] 8.6-9.1). Rates of errors for all doctors in training were significantly higher than rates for medical consultants. Doctors in their first year of training (odds ratio [OR] 2.13; 95% CI 1.80-2.52) or second year of training (OR 2.23; 95% CI 1.89-2.65) were more than twice as likely to prescribe erroneously. Prescribing errors were 70% (OR 1.70; 95% CI 1.61-1.80) more likely to occur at the time of hospital admission than when medication orders were issued during the hospital stay. No significant differences in severity of error were observed between grades of prescriber. Potentially serious errors were more likely to be associated with prescriptions for parenteral administration, especially for cardiovascular or endocrine disorders.
Conclusions: The problem of prescribing errors in hospitals is substantial and not solely a problem of the most junior medical prescribers, particularly for those errors most likely to cause significant patient harm. Interventions are needed to target these high-risk errors by all grades of staff and hence improve patient safety.
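The headline rate in the abstract above is a simple proportion. A minimal sketch using the abstract's counts (10,986 erroneous orders among 124,260 medication orders) with a textbook Wald interval; note the authors' published CI (8.6-9.1) is slightly wider, consistent with their adjustment for clustering by hospital, which this sketch does not attempt:

```python
import math

# Counts from the abstract; CI formula is a plain normal-approximation
# (Wald) interval, not necessarily the authors' cluster-adjusted method.
erroneous_orders, total_orders = 10_986, 124_260

rate = erroneous_orders / total_orders               # proportion of orders with an error
se = math.sqrt(rate * (1 - rate) / total_orders)     # standard error of the proportion
lo, hi = rate - 1.96 * se, rate + 1.96 * se          # 95% interval

print(f"Error rate: {rate*100:.1f} per 100 orders "
      f"(unadjusted 95% CI {lo*100:.1f}-{hi*100:.1f})")
```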
Abstract:
BACKGROUND: Understanding the heterogeneous genotypes and phenotypes of prostate cancer is fundamental to improving the way we treat this disease. As yet, there are no validated descriptions of prostate cancer subgroups derived from integrated genomics linked with clinical outcome.
METHODS: In a study of 482 tumour, benign and germline samples from 259 men with primary prostate cancer, we used integrative analysis of copy number alterations (CNA) and array transcriptomics to identify genomic loci that affect expression levels of mRNA in an expression quantitative trait loci (eQTL) approach, to stratify patients into subgroups that we then associated with future clinical behaviour, and compared with either CNA or transcriptomics alone.
FINDINGS: We identified five separate patient subgroups with distinct genomic alterations and expression profiles based on 100 discriminating genes in our separate discovery and validation sets of 125 and 103 men. These subgroups were able to consistently predict biochemical relapse (p = 0.0017 and p = 0.016 respectively) and were further validated in a third cohort with long-term follow-up (p = 0.027). We show the relative contributions of gene expression and copy number data on phenotype, and demonstrate the improved power gained from integrative analyses. We confirm alterations in six genes previously associated with prostate cancer (MAP3K7, MELK, RCBTB2, ELAC2, TPD52, ZBTB4), and also identify 94 genes not previously linked to prostate cancer progression that would not have been detected using either transcript or copy number data alone. We confirm a number of previously published molecular changes associated with high risk disease, including MYC amplification, and NKX3-1, RB1 and PTEN deletions, as well as over-expression of PCA3 and AMACR, and loss of MSMB in tumour tissue. A subset of the 100 genes outperforms established clinical predictors of poor prognosis (PSA, Gleason score), as well as previously published gene signatures (p = 0.0001). We further show how our molecular profiles can be used for the early detection of aggressive cases in a clinical setting, and inform treatment decisions.
INTERPRETATION: For the first time in prostate cancer this study demonstrates the importance of integrated genomic analyses incorporating both benign and tumour tissue data in identifying molecular alterations leading to the generation of robust gene sets that are predictive of clinical outcome in independent patient cohorts.
Abstract:
BACKGROUND: Acute promyelocytic leukaemia is a chemotherapy-sensitive subgroup of acute myeloid leukaemia characterised by the presence of the PML-RARA fusion transcript. The present standard of care, chemotherapy and all-trans retinoic acid (ATRA), results in a high proportion of patients being cured. In this study, we compare a chemotherapy-free ATRA and arsenic trioxide treatment regimen with the standard chemotherapy-based regimen (ATRA and idarubicin) in both high-risk and low-risk patients with acute promyelocytic leukaemia.
METHODS: In the randomised, controlled, multicentre, AML17 trial, eligible patients (aged ≥16 years) with acute promyelocytic leukaemia, confirmed by the presence of the PML-RARA transcript and without significant cardiac or pulmonary comorbidities or active malignancy, and who were not pregnant or breastfeeding, were enrolled from 81 UK hospitals and randomised 1:1 to receive treatment with ATRA and arsenic trioxide or ATRA and idarubicin. ATRA was given to participants in both groups in a daily divided oral dose of 45 mg/m(2) until remission, or until day 60, and then in a 2 weeks on-2 weeks off schedule. In the ATRA and idarubicin group, idarubicin was given intravenously at 12 mg/m(2) on days 2, 4, 6, and 8 of course 1, and then at 5 mg/m(2) on days 1-4 of course 2; mitoxantrone at 10 mg/m(2) on days 1-4 of course 3, and idarubicin at 12 mg/m(2) on day 1 of the final (fourth) course. In the ATRA and arsenic trioxide group, arsenic trioxide was given intravenously at 0·3 mg/kg on days 1-5 of each course, and at 0·25 mg/kg twice weekly in weeks 2-8 of course 1 and weeks 2-4 of courses 2-5. High-risk patients (those presenting with a white blood cell count >10 × 10(9) cells per L) could receive an initial dose of the immunoconjugate gemtuzumab ozogamicin (6 mg/m(2) intravenously). Neither maintenance treatment nor CNS prophylaxis was given to patients in either group. All patients were monitored by real-time quantitative PCR. Allocation was by central computer minimisation, stratified by age, performance status, and de-novo versus secondary disease. The primary endpoint was quality of life on the European Organisation for Research and Treatment of Cancer (EORTC) QLQ-C30 global health status. All analyses are by intention to treat. This trial is registered with the ISRCTN registry, number ISRCTN55675535.
FINDINGS: Between May 8, 2009, and Oct 3, 2013, 235 patients were enrolled and randomly assigned to ATRA and idarubicin (n=119) or ATRA and arsenic trioxide (n=116). Participants had a median age of 47 years (range 16-77; IQR 33-58) and included 57 high-risk patients. Quality of life did not differ significantly between the treatment groups (EORTC QLQ-C30 global functioning effect size 2·17 [95% CI -2·79 to 7·12; p=0·39]). Overall, 57 patients in the ATRA and idarubicin group and 40 patients in the ATRA and arsenic trioxide group reported grade 3-4 toxicities. After course 1 of treatment, grade 3-4 alopecia was reported in 23 (23%) of 98 patients in the ATRA and idarubicin group versus 5 (5%) of 95 in the ATRA and arsenic trioxide group, raised liver alanine transaminase in 11 (10%) of 108 versus 27 (25%) of 109, and oral toxicity in 22 (19%) of 115 versus 1 (1%) of 109. After course 2 of treatment, grade 3-4 alopecia was reported in 25 (28%) of 89 patients in the ATRA and idarubicin group versus 2 (3%) of 77 in the ATRA and arsenic trioxide group; no other toxicities reached the 10% level. Patients in the ATRA and arsenic trioxide group had significantly less requirement for most aspects of supportive care than did those in the ATRA and idarubicin group.
INTERPRETATION: ATRA and arsenic trioxide is a feasible treatment in low-risk and high-risk patients with acute promyelocytic leukaemia, with a high cure rate, less relapse than and survival no different from ATRA and idarubicin, and a low incidence of liver toxicity. However, no improvement in quality of life was seen.
Abstract:
Autologous stem cell transplantation (ASCT) consolidation remains the treatment of choice for patients with relapsed diffuse large B cell lymphoma. The impact of rituximab combined with chemotherapy in either first- or second-line therapy on the ultimate results of ASCT remains to be determined, however. This study was designed to evaluate the benefit of ASCT in patients achieving a second complete remission after salvage chemotherapy by retrospectively comparing the disease-free survival (DFS) after ASCT for each patient with the duration of the first complete remission (CR1). Between 1990 and 2005, a total of 470 patients who had undergone ASCT and reported to the European Blood and Bone Transplantation Registry with Medical Essential Data Form B information were evaluated. Of these 470 patients, 351 (74%) had not received rituximab before ASCT, and 119 (25%) had received rituximab before ASCT. The median duration of CR1 was 11 months. The median time from diagnosis to ASCT was 24 months. The BEAM protocol was the most frequently used conditioning regimen (67%). After ASCT, the 5-year overall survival was 63% (95% confidence interval, 58%-67%) and 5-year DFS was 48% (95% confidence interval, 43%-53%) for the entire patient population. Statistical analysis showed a significant increase in DFS after ASCT compared with duration of CR1 (median, 51 months versus 11 months; P < .001). This difference was also highly significant for patients with previous exposure to rituximab (median, 10 months versus not reached; P < .001) and for patients who had experienced relapse before 1 year (median, 6 months versus 47 months; P < .001). Our data indicate that ASCT can significantly increase DFS compared with the duration of CR1 in relapsed diffuse large B cell lymphoma and can alter the disease course even in patients with high-risk disease previously treated with rituximab.
Abstract:
The identification of subjects at high risk for Alzheimer’s disease is important for prognosis and early intervention. We investigated the polygenic architecture of Alzheimer’s disease and the accuracy of Alzheimer’s disease prediction models, including and excluding the polygenic component in the model. This study used genotype data from a large dataset comprising 17 008 cases and 37 154 controls obtained from the International Genomics of Alzheimer’s Project (IGAP). Polygenic score analysis tested whether the alleles identified to associate with disease in one sample set were significantly enriched in the cases relative to the controls in an independent sample. The disease prediction accuracy was investigated in a subset of the IGAP data, a sample of 3049 cases and 1554 controls (for whom APOE genotype data were available), by means of sensitivity, specificity, area under the receiver operating characteristic curve (AUC) and positive and negative predictive values. We observed significant evidence for a polygenic component enriched in Alzheimer’s disease (P = 4.9 × 10−26). This enrichment remained significant after APOE and other genome-wide associated regions were excluded (P = 3.4 × 10−19). The best prediction accuracy, AUC = 78.2% (95% confidence interval 77-80%), was achieved by a logistic regression model with APOE, the polygenic score, sex and age as predictors. In conclusion, Alzheimer’s disease has a significant polygenic component, which has predictive utility for Alzheimer’s disease risk and could be a valuable research tool complementing experimental designs, including preventative clinical trials, stem cell selection and high/low risk clinical studies. In modelling a range of sample disease prevalences, we found that polygenic scores almost double case prediction from chance, with increased prediction at the polygenic extremes.
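The AUC reported above can be read as the probability that a randomly chosen case has a higher risk score than a randomly chosen control (the Mann-Whitney interpretation). A minimal sketch of that calculation on synthetic Gaussian score distributions, which are stand-ins for illustration only, not IGAP data:

```python
import random

def auc(case_scores, control_scores):
    """Mann-Whitney AUC: P(case score > control score), ties counted half."""
    wins = ties = 0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

rng = random.Random(0)
# Synthetic polygenic-style scores: cases shifted upward relative to controls.
cases = [rng.gauss(0.8, 1.0) for _ in range(500)]
controls = [rng.gauss(0.0, 1.0) for _ in range(500)]

print(f"AUC ~ {auc(cases, controls):.2f}")
```

In practice the score for each subject would be a weighted sum of risk-allele counts, and the AUC would be evaluated on held-out samples, as the study does with its independent IGAP subset.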