830 results for high risk population


Relevance:

90.00%

Publisher:

Abstract:

A total of 167 sheep belonging to the Estonian whiteheaded mutton, Estonian blackheaded mutton, Lithuanian coarsewool native, Lithuanian blackface and Latvian darkheaded mutton breeds, and a population of sheep kept isolated on the Estonian island of Ruhnu, were sequence-analysed for polymorphisms in the prion protein (PrP) gene, to determine their genotype and the allele frequencies of polymorphisms in PrP known to confer resistance to scrapie. A 939 base pair fragment of exon 3 from the PrP gene was amplified by PCR and analysed by direct sequencing. For animals showing polymorphism at two nucleotide positions, both haplotypes of these double-heterozygous genotypes were further verified by PCR cloning and sequence analysis. Known polymorphisms were observed at codons 136, 154 and 171, and six different haplotypes (ARR, AHQ, ARH, AHR, ARQ and VRQ) were determined. On the basis of these polymorphisms, the six populations of sheep possessed the resistant ARR haplotype at different frequencies. The high-risk ARQ haplotype occurred at high frequencies in all six populations, but VRQ, the haplotype carrying the highest risk, occurred at low frequencies and in only three of the populations.
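Haplotype frequencies like those reported above are estimated by counting the two haplotypes carried by each animal. A minimal sketch in Python, using invented genotypes for illustration (none of these data come from the study):

```python
from collections import Counter

# hypothetical PrP genotypes: each animal carries two haplotypes
genotypes = [("ARR", "ARQ"), ("ARQ", "ARQ"), ("ARR", "ARR"), ("ARQ", "VRQ")]

# count every haplotype across all animals
counts = Counter(h for pair in genotypes for h in pair)
total = sum(counts.values())  # two haplotypes per animal
freqs = {h: c / total for h, c in counts.items()}
print(freqs)  # → {'ARR': 0.375, 'ARQ': 0.5, 'VRQ': 0.125}
```

For real data the genotype pairs would come from the phased sequencing results (as obtained here by PCR cloning for double heterozygotes); the counting logic is the same.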

BACKGROUND: We sought to determine whether a high-risk group could be defined among patients with operable breast cancer in whom a search of occult central nervous system (CNS) metastases was justified. PATIENTS AND METHODS: We evaluated data from 9524 women with early breast cancer (42% node-negative) who were randomized in International Breast Cancer Study Group clinical trials between 1978 and 1999, and treated without anthracyclines, taxanes, or trastuzumab. We identified patients whose site of first event was CNS and those who had a CNS event at any time. RESULTS: Median follow-up was 13 years. The 10-year incidence (10-yr) of CNS relapse was 5.2% (1.3% as first recurrence). Factors predictive of CNS as first recurrence included: node-positive disease (10-yr = 2.2% for > 3 N+), estrogen receptor-negative (2.3%), tumor size > 2 cm (1.7%), tumor grade 3 (2.0%), < 35 years old (2.2%), HER2-positive (2.7%), and estrogen receptor-negative and node-positive (2.6%). The risk of subsequent CNS recurrence was elevated in patients experiencing lung metastases (10-yr = 16.4%). CONCLUSION: Based on this large cohort we were able to define risk factors for CNS metastases, but could not define a group at sufficient risk to justify routine screening for occult CNS metastases.

Studies of high-altitude populations, and in particular of maladapted subgroups, may provide important insight into underlying mechanisms involved in the pathogenesis of hypoxemia-related disease states in general. Over the past decade, studies involving short-term hypoxic exposure have greatly advanced our knowledge regarding underlying mechanisms and predisposing events of hypoxic pulmonary hypertension. Studies in high altitude pulmonary edema (HAPE)-prone subjects, a condition characterized by exaggerated hypoxic pulmonary hypertension, have provided evidence for the central role of pulmonary vascular endothelial and respiratory epithelial nitric oxide (NO) for pulmonary artery pressure homeostasis. More recently, it has been shown that pathological events during the perinatal period (possibly by impairing pulmonary NO synthesis) predispose to exaggerated hypoxic pulmonary hypertension later in life. In an attempt to translate some of this new knowledge to the understanding of underlying mechanisms and predisposing events of chronic hypoxic pulmonary hypertension, we have recently initiated a series of studies among high-risk subpopulations (experiments of nature) of high-altitude dwellers. These studies have allowed us to identify novel risk factors and underlying mechanisms that may predispose to sustained hypoxic pulmonary hypertension. The aim of this article is to briefly review these new data and demonstrate that insufficient NO synthesis/bioavailability, possibly related in part to augmented oxidative stress, may represent an important underlying mechanism predisposing to pulmonary hypertension in high-altitude dwellers.

PURPOSE: The aim of this study was to characterize occupational facial fractures in central Switzerland. Concomitant injuries were also studied. MATERIALS AND METHODS: The Department of Cranio-Maxillofacial Surgery at the University Hospital in Berne provides a 24-hour maxillofacial trauma service for its population (1.6 million). The present study comprised 42 patients (8.4% of treated maxillofacial injuries) with occupational maxillofacial fractures registered at this unit between 2000 and 2002. Information on occupation, the cause of the accidents, and the topographic location of the fractures was analyzed. RESULTS: The mean age of the patients was 44.4 years, with a male to female ratio of 41:1. Sixty-nine percent of the injuries occurred in farm and forestry workers and in construction laborers; one third (33%) occurred during the summer. Workers in these occupations carried a 127-fold (farm and forestry workers) and a 44-fold (construction laborers) higher risk of incurring maxillofacial fractures than did service and office workers. Injuries were most frequently (43%) caused by a thrown, projected, or falling object. Eighty-two percent of the fractures occurred in the midface region and at the skull base. Fifty-nine percent of the patients had concomitant injuries. In 69%, surgery was necessary, with a mean hospital stay of 4.8 days. CONCLUSION: The probability of sustaining work-related maxillofacial trauma is correlated with the nature of the occupation. Farm and forestry workers are at the highest risk, most frequently injured by being struck by an object or an animal. The introduction of personalized safety measures should become obligatory in high-risk occupations.

BACKGROUND AND PURPOSE: This is the first study investigating neoadjuvant interstitial high-dose-rate (HDR) brachytherapy combined with chemotherapy in patients with breast cancer. The goal was to evaluate the type of surgical treatment, histopathologic response, side effects, local control, and survival. PATIENTS AND METHODS: 53 patients, who could not be treated with breast-conserving surgery due to initial tumor size (36/53) or due to an unfavorable breast-tumor ratio (17/53), were analyzed retrospectively. All but one were in an intermediate/high-risk group (St. Gallen criteria). The patients received a neoadjuvant protocol consisting of systemic chemotherapy combined with fractionated HDR brachytherapy (2 × 5 Gy/day, total dose 30 Gy). In cases where breast-conserving surgery was performed, patients received additional external-beam radiotherapy (EBRT, 1.8 Gy/day, total dose 50.4 Gy). In patients who underwent mastectomy but showed an initial tumor size of T3/T4 and/or more than three infiltrated lymph nodes, EBRT was also performed. RESULTS: In 30/53 patients (56.6%) breast-conserving surgery could be performed. The overall histopathologic response rate was 96.2%, with a complete remission in 28.3% of patients. 49/53 patients were evaluable for follow-up. After a median of 58 months (45-72 months), one patient showed a mild fibrosis of the breast tissue, and three patients had mild to moderate lymphatic edema of the arm. 6/49 (12.2%) patients died of distant metastases, 4/49 (8.2%) were alive with disease, and 39/49 (79.6%) were free from disease. Local recurrence was observed in only one case (2%), 40 months after primary therapy. After mastectomy, this patient is currently free from disease. CONCLUSION: The combination of interstitial HDR brachytherapy and chemotherapy is a well-tolerated and effective neoadjuvant treatment in patients with breast cancer. Compared to EBRT, treatment time is short. 
Postoperative EBRT of the whole breast -- if necessary -- is still possible after neoadjuvant brachytherapy. Even though the number of patients does not permit definite conclusions, the results are promising regarding survival and the very low rate of local recurrences.

BACKGROUND: The epidemiology of liver disease in patients admitted to emergency rooms is largely unknown. The current study aimed to measure the prevalence of viral hepatitis B and C infection and pathological laboratory values of liver disease in such a population, and to study factors associated with these measurements. METHODS: Cross-sectional study in patients admitted to the emergency room of a university hospital, with no formal exclusion criteria. Determination of anti-HBc, anti-HCV, transferrin saturation, and alanine aminotransferase, plus answers to a study-specific questionnaire. RESULTS: The study included 5036 patients, representing a 14.9% sample of the target population during the study period. Prevalence of anti-HBc and anti-HCV was 6.7% (95% CI 6.0% to 7.4%) and 2.7% (2.3% to 3.2%), respectively. Factors independently associated with positive anti-HBc were intravenous drug abuse (OR 18.3; 11.3 to 29.7), foreign country of birth (3.4; 2.6 to 4.4), non-white ethnicity (2.7; 1.9 to 3.8) and age ≥60 (2.0; 1.5 to 2.8). Positive anti-HCV was associated with intravenous drug abuse (78.9; 43.4 to 143.6), blood transfusion (1.7; 1.1 to 2.8) and abdominal pain (2.7; 1.5 to 4.8). 75% of all participants were not vaccinated against hepatitis B or did not know their vaccination status. Among anti-HCV positive patients, only 49% knew about their infection and 51% reported regular alcohol consumption. Transferrin saturation was elevated in 3.3% and was associated with fatigue (prevalence ratio 1.9; 1.2 to 2.8). CONCLUSION: Emergency rooms should be considered as targets for public health programs that encourage vaccination, patient education and screening of high-risk patients for liver disease, with subsequent referral for treatment if indicated.
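The prevalence estimates above carry 95% confidence intervals that can be reproduced from the sample size with a normal-approximation (Wald) interval. A sketch, assuming a positive anti-HBc count of 337 (back-calculated from the reported 6.7% of 5036, so an approximation):

```python
import math

def prop_ci(k, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a binomial proportion."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# 337/5036 anti-HBc positive (count back-calculated, an assumption)
p, lo, hi = prop_ci(337, 5036)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # → 6.7% (95% CI 6.0% to 7.4%)
```

This reproduces the reported anti-HBc interval; for rarer outcomes or small samples, a Wilson or exact interval would be preferable to the Wald approximation.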

BACKGROUND: Early catheter-related infection is a serious complication in cancer treatment, although risk factors for its occurrence are not well established. The authors conducted a prospective study to identify the risk factors for developing early catheter-related infection. METHODS: All consecutive patients with cancer who underwent insertion of a central venous catheter were enrolled and were followed prospectively during 1 month. The study endpoint was occurrence of early catheter-related infection. RESULTS: Over 10,392 catheter-days of follow-up, 14 of 371 patients had early catheter-related infections (1.34 per 1000 catheter-days). The causative pathogens were gram positive in 11 of 14 patients. In univariate analysis, the risk factors for early catheter-related infection were age <10 years (P = .0001), difficulties during insertion (P < 10⁻⁶), blood product administration (P < 10⁻³), parenteral nutrition (P < 10⁻⁴), and use for >2 days (P < 10⁻⁶). In multivariate analysis, 3 variables remained significantly associated with the risk of early catheter-related infection: age <10 years (odds ratio [OR], 18.4; 95% confidence interval [95% CI], 1.9-106.7), difficulties during the insertion procedure (OR, 25.6; 95% CI, 4.2-106), and parenteral nutrition (OR, 28.5; 95% CI, 4.2-200). CONCLUSIONS: On the day of insertion, 2 variables were identified that were associated with a high risk of developing an early catheter-related infection: young age and difficulties during insertion. The results from this study may be used to identify patients who are at high risk of infection and who may be candidates for preventive strategies.
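The incidence figure quoted above is an exposure-adjusted rate: events divided by total catheter-days, scaled to 1000. A quick check of the arithmetic (the function name is ours, not from the study):

```python
def rate_per_1000(events, exposure):
    """Incidence rate per 1000 units of exposure time (here, catheter-days)."""
    return 1000 * events / exposure

# 14 early catheter-related infections over 10,392 catheter-days of follow-up
rate = rate_per_1000(14, 10_392)
print(round(rate, 3))  # → 1.347, reported as 1.34 per 1000 catheter-days
```

Person-time denominators of this kind let studies with different follow-up durations be compared on the same scale.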

BACKGROUND: Screening programmes are promoted to control transmission of and prevent female reproductive tract morbidity caused by genital chlamydia. The objective of this study was to examine the effectiveness of register-based and opportunistic chlamydia screening interventions. METHODS: We searched seven electronic databases (Cinahl, Cochrane Controlled Trials Register, DARE, Embase, Medline, PsycINFO and SIGLE) without language restrictions from January 1990 to October 2007 and reference lists of retrieved articles to identify studies published before 1990. We included studies examining primary outcomes (pelvic inflammatory disease, ectopic pregnancy, infertility, adverse pregnancy outcomes, neonatal infection, chlamydia prevalence) and harms of chlamydia screening in men and non-pregnant and pregnant women. We extracted data in duplicate and synthesized the data narratively or used random effects meta-analysis, where appropriate. RESULTS: We included six systematic reviews, five randomized trials, one non-randomized comparative study and one time trend study. Five reviews recommended screening of women at high risk of chlamydia. Two randomized trials found that register-based screening of women at high risk of chlamydia and of female and male high school students reduced the incidence of pelvic inflammatory disease in women at 1 year. Methodological inadequacies could have overestimated the observed benefits. One randomized trial showed that opportunistic screening in women undergoing surgical termination of pregnancy reduced post-abortal rates of pelvic inflammatory disease compared with no screening. We found no randomized trials showing a benefit of opportunistic screening in other populations, no trial examining the effects of more than one screening round and no trials examining the harms of chlamydia screening. 
CONCLUSION: There is an absence of evidence supporting opportunistic chlamydia screening in the general population younger than 25 years, the most commonly recommended approach. Equipoise remains, so high-quality randomized trials of multiple rounds of screening with biological outcome measures are still needed to determine the balance of benefits and harms of chlamydia screening.

Invasive plant species threaten natural areas by reducing biodiversity and altering ecosystem functions. They also impact agriculture by reducing crop and livestock productivity. Millions of dollars are spent on invasive species control each year, and traditionally, herbicides are used to manage invasive species. Herbicides have human and environmental health risks associated with them; therefore, it is essential that land managers and stakeholders attempt to reduce these risks by utilizing the principles of integrated weed management. Integrated weed management is a practice that incorporates a variety of measures and focuses on the ecology of the invasive plant to manage it. Roadways are high-risk areas with a high incidence of invasive species. Roadways act as conduits for invasive species spread and are ideal harborages for population growth; therefore, roadways should be a primary target for invasive species control. There are four stages in the invasion process that an invasive species must pass through: transport, establishment, spread, and impact. The aim of this dissertation was to focus on these four stages and examine the mechanisms underlying the progression from one stage to the next, while also developing integrated weed management strategies. The target species were Phragmites australis, common reed, and Cirsium arvense, Canada thistle. The transport and establishment risks of P. australis can be reduced by removing rhizome fragments from soil when roadside maintenance is performed. The establishment and spread of C. arvense can be reduced by planting particular resistant species, e.g. Heterotheca villosa, especially those that can reduce light transmittance to the soil. Finally, the spread and impact of C. arvense can be mitigated on roadsides through the use of the herbicide aminopyralid. The risks associated with herbicide drift produced by application equipment can be reduced by using the Wet-Blade herbicide application system.

Volcanoes pose a threat to the human population at regional and global scales, so efficient monitoring is essential in order to effectively manage and mitigate the risks that they pose. Volcano monitoring from space has been possible for over thirty years and now, more than ever, a suite of instruments exists with the capability to observe emissions of gas and ash from a unique perspective. The goal of this research is to demonstrate the use of a range of satellite-based sensors to detect and quantify volcanic sulphur dioxide, and to assess the relative performances of the sensors against one another. Such comparisons are important in order to standardise retrievals and permit better estimates of the global contribution of volcanic sulphur dioxide to the atmosphere for climate modelling. In this work, retrievals of volcanic sulphur dioxide from a number of instruments are compared, and their individual performances at quantifying emissions from large, explosive volcanic eruptions are assessed. Retrievals vary widely from sensor to sensor, and the use of several sensors in synergy often provides a more complete picture than any one instrument alone. Volcanic emissions can cause significant economic losses by grounding aircraft, owing to the severe hazard that ash poses to aircraft. As sulphur dioxide is often easier to measure than ash, it is often used as a proxy. This work examines whether this is a reasonable assumption, using the Icelandic eruption of early 2010 as a case study. Results indicate that although the two species are for the most part collocated, separation can occur under some conditions, meaning that it is essential to measure both species accurately in order to provide effective hazard mitigation. Finally, the usefulness of satellite remote sensing in quantifying passive degassing from Turrialba, Costa Rica is demonstrated. The increase in activity from 2005–2010 can be observed in satellite data prior to the phreatic phase of early 2010, and can therefore potentially provide a useful indication of changing activity at some volcanoes.

Diabetic patients with acute coronary syndromes (ACSs) are at a high risk for subsequent cardiovascular events but derive, at the same time, greater benefit from evidence-based therapy than non-diabetic individuals. State-of-the-art anti-thrombotic therapy includes a triple anti-platelet combination, aspirin, clopidogrel and glycoprotein (GP) IIb/IIIa receptor inhibitors, and unfractionated heparin or enoxaparin. For low- or medium-risk individuals, a treatment based on aspirin, clopidogrel and bivalirudin is a valuable alternative. Prasugrel, a new and more potent inhibitor of the platelet P2Y12 receptor, has to be regarded as the most promising anti-thrombotic agent for diabetic patients with ACS. This agent may replace clopidogrel, and possibly GP IIb/IIIa inhibitors, in the future. In addition to aggressive anti-thrombotic therapy, diabetic patients should undergo systematic early invasive angiography if presenting with non-ST-segment elevation ACS, and immediate percutaneous coronary intervention if presenting with ST-segment elevation myocardial infarction. Indeed, the benefit derived from these strategies appears to be more pronounced in the diabetic population than in non-diabetic individuals. Despite the benefit, multiple surveys have demonstrated that, in the setting of ACS, diabetic patients receive evidence-based therapy less frequently than non-diabetic counterparts.

BACKGROUND: Transcatheter aortic valve implantation (TAVI) for high-risk and inoperable patients with severe aortic stenosis is an emerging procedure in cardiovascular medicine. Little is known of the impact of TAVI on renal function. METHODS: We retrospectively analysed renal baseline characteristics and outcome in 58 patients, including 2 patients on chronic haemodialysis, undergoing TAVI at our institution. Acute kidney injury (AKI) was defined according to the RIFLE classification. RESULTS: Fifty-eight patients with severe symptomatic aortic stenosis not considered suitable for conventional surgical valve replacement, with a mean age of 83 ± 5 years, underwent TAVI. Two patients died during transfemoral valve implantation and two patients in the first month after TAVI, resulting in a 30-day mortality of 6.9%. Vascular access was transfemoral in 46 patients and transapical in 12. Estimated glomerular filtration rate (eGFR) increased in 30 patients (56%). Fifteen patients (28%) developed AKI, of whom four had to be dialyzed temporarily and one remained on chronic renal replacement therapy. Risk factors for AKI comprised, among others, transapical access, number of blood transfusions, postinterventional thrombocytopaenia and systemic inflammatory response syndrome (SIRS). CONCLUSIONS: TAVI is feasible in patients with a high burden of comorbidities and in patients with pre-existing end-stage renal disease who would otherwise not be considered as candidates for conventional aortic valve replacement. Although GFR improved in more than half of the patients, this benefit was associated with a risk of postinterventional AKI. Future investigations should define preventive measures against peri-procedural kidney injury.

BACKGROUND: A complete remission is essential for prolonging survival in patients with acute myeloid leukemia (AML). Daunorubicin is a cornerstone of the induction regimen, but the optimal dose is unknown. In older patients, it is usual to give daunorubicin at a dose of 45 to 50 mg per square meter of body-surface area. METHODS: Patients in whom AML or high-risk refractory anemia had been newly diagnosed and who were 60 to 83 years of age (median, 67) were randomly assigned to receive cytarabine, at a dose of 200 mg per square meter by continuous infusion for 7 days, plus daunorubicin for 3 days, either at the conventional dose of 45 mg per square meter (411 patients) or at an escalated dose of 90 mg per square meter (402 patients); this treatment was followed by a second cycle of cytarabine at a dose of 1000 mg per square meter every 12 hours for 6 days. The primary end point was event-free survival. RESULTS: The complete remission rates were 64% in the group that received the escalated dose of daunorubicin and 54% in the group that received the conventional dose (P=0.002); the rates of remission after the first cycle of induction treatment were 52% and 35%, respectively (P<0.001). There was no significant difference between the two groups in the incidence of hematologic toxic effects, 30-day mortality (11% and 12% in the two groups, respectively), or the incidence of moderate, severe, or life-threatening adverse events (P=0.08). Survival end points in the two groups did not differ significantly overall, but patients in the escalated-treatment group who were 60 to 65 years of age, as compared with the patients in the same age group who received the conventional dose, had higher rates of complete remission (73% vs. 51%), event-free survival (29% vs. 14%), and overall survival (38% vs. 23%). 
CONCLUSIONS: In patients with AML who are older than 60 years of age, escalation of the dose of daunorubicin to twice the conventional dose, with the entire dose administered in the first induction cycle, effects a more rapid response and a higher response rate than does the conventional dose, without additional toxic effects. (Current Controlled Trials number, ISRCTN77039377; and Netherlands National Trial Register number, NTR212.)
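The reported difference in complete remission rates (64% vs. 54%) can be checked with a standard two-proportion z-test. The event counts below are back-calculated from the rounded percentages and group sizes, so the resulting P value is only approximate and will not match the published P=0.002 exactly:

```python
import math

def two_prop_z(k1, n1, k2, n2):
    """Two-sided two-proportion z-test using a pooled standard error."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))  # two-sided P from the normal tail
    return z, pval

# escalated dose: ~64% of 402 patients; conventional: ~54% of 411 (counts assumed)
z, p = two_prop_z(round(0.64 * 402), 402, round(0.54 * 411), 411)
print(f"z = {z:.2f}, two-sided P = {p:.4f}")
```

The trial itself would have used the exact counts (and possibly a different test), which is why the back-calculated P value only approximates the published one.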

OBJECTIVES: Ventilated preterm infants are at high risk for procedural pain exposure. In Switzerland there is a lack of knowledge about pain management in this highly vulnerable patient population. The aims of this study were to describe the type and frequency of procedures and to determine the amount of analgesia given to this patient group in two Swiss neonatal intensive care units. METHOD: A retrospective cohort study was performed examining procedural exposure and pain management of a convenience sample of 120 ventilated preterm infants (mean age = 29.7 weeks of gestation) during the first 14 days of life, born between May 1st 2004 and March 31st 2006. RESULTS: The total number of procedures all the infants underwent was 38,626, indicating a mean of 22.9 general procedures performed per infant per day. Overall, 75.6% of these procedures are considered to be painful. The most frequently performed procedure was manipulation of the CPAP prongs. Pain measurements were performed four to seven times per day. In all, 99.2% of the infants received either non-pharmacological and/or pharmacological agents and 70.8% received orally administered glucose as pre-emptive analgesia. Morphine was the most commonly used pharmacological agent. DISCUSSION: The number of procedures ventilated preterm infants are exposed to is disconcerting. Iatrogenic pain is a serious problem, particularly in preterm infants of low gestational age. The fact that nurses assessed pain on average four to seven times daily per infant indicates a commitment to exploring a painful state in a highly vulnerable patient population. In general, pharmacological pain management and the administration of oral glucose as a non-pharmacological pain-relieving intervention appear to be adequate, but there may be deficiencies, particularly for extremely low birth weight infants born <28 weeks of gestation.

BACKGROUND: Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. METHODS: On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with a virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk group-specific probabilities for the presence of drug resistance mutations (upper estimate). RESULTS: Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients who were taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had ≥1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence were becoming smaller in later years. CONCLUSIONS: Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. 
Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
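The upper estimate described in the methods, a frequency-weighted average of risk-group-specific resistance probabilities, is straightforward to compute. A sketch with invented group sizes and probabilities (not the study's actual risk groups):

```python
def weighted_prevalence(groups):
    """Frequency-weighted average of group-specific resistance probabilities.

    `groups` is a list of (n_patients, p_resistance) pairs.
    """
    total = sum(n for n, _ in groups)
    return sum(n * p for n, p in groups) / total

# hypothetical risk groups: (number of patients, probability of resistance)
groups = [(1000, 0.75), (3000, 0.45), (4000, 0.10)]
print(weighted_prevalence(groups))  # → 0.3125
```

Weighting by group size is what makes this an upper-bound population estimate: every patient contributes their group's probability of resistance, rather than only patients with confirmed resistance being counted.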