961 results for Evaluate Risk
Abstract:
Background Left atrium (LA) dilation and P-wave duration are linked to the amount of endurance training and are risk factors for atrial fibrillation (AF). The aim of this study was to evaluate the impact of LA anatomical and electrical remodeling on its conduit and pump function measured by two-dimensional speckle tracking echocardiography (STE). Method Amateur male runners > 30 years of age were recruited. Study participants (n = 95) were stratified into 3 groups according to lifetime training hours: low (< 1500 h, n = 33), intermediate (1500 to 4500 h, n = 32) and high training group (> 4500 h, n = 30). Results No differences were found between the groups in terms of age, blood pressure, and diastolic function. LA maximal volume (30 ± 5, 33 ± 5 vs. 37 ± 6 ml/m2, p < 0.001) and conduit volume index (9 ± 3, 11 ± 3 vs. 12 ± 3 ml/m2, p < 0.001) increased significantly from the low to the high training group, unlike the STE parameters: pump strain − 15.0 ± 2.8, − 14.7 ± 2.7 vs. − 14.9 ± 2.6%, p = 0.927; conduit strain 23.3 ± 3.9, 22.1 ± 5.3 vs. 23.7 ± 5.7%, p = 0.455. Independent predictors of LA conduit strain were age, maximal early diastolic velocity of the mitral annulus, heart rate and peak early diastolic filling velocity. The signal-averaged P-wave duration (135 ± 11, 139 ± 10 vs. 148 ± 14 ms, p < 0.001) increased from the low to the high training group. Four episodes of non-sustained AF were recorded in one runner of the high training group. Conclusion LA anatomical and electrical remodeling does not have a negative impact on atrial mechanical function. Hence, a possible link between these risk factors for AF and its actual, rare occurrence in this athlete population could not be uncovered in the present study.
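The between-group p-values reported above are consistent with a one-way ANOVA across the three training groups. A minimal sketch of such a comparison, assuming scipy; the group means and SDs come from the abstract, but the per-runner values are simulated for illustration only:

```python
# A minimal one-way ANOVA sketch for a three-group comparison like the one
# above. Group means/SDs are taken from the abstract (LA maximal volume,
# ml/m2); the individual runner values are simulated, not the study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
low = rng.normal(30, 5, 33)           # low training group (n = 33)
intermediate = rng.normal(33, 5, 32)  # intermediate group (n = 32)
high = rng.normal(37, 6, 30)          # high training group (n = 30)

f_stat, p_value = stats.f_oneway(low, intermediate, high)
print(f"F = {f_stat:.2f}, p = {p_value:.2g}")  # expect p < 0.001 for these means
```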
Abstract:
Calf losses (CL; mortality and unwanted early slaughter) in veal production are of great economic importance and an indicator of welfare. The objective of the present study was to evaluate CL and the causes of death on farms with a specific animal welfare standard (SAW) that exceeds the Swiss statutory regulations. Risk factors for CL were identified based on information about management, housing, feeding, and medication. In total, 74 production cohorts (2,783 calves) from 15 farms were investigated. CL was 3.6%; the main causes of death were digestive disorders (52%), followed by respiratory diseases (28%). Factors significantly associated with an increased risk of CL were a higher number of individual daily doses of antibiotics (DDA), insufficient wind deflection in winter, and male sex. When antibiotics were administered to all calves of a cohort, a DDA of 14-21 days was associated with a decreased risk of CL compared with a DDA of 7-13 days.
Abstract:
BACKGROUND AND AIMS Hypoxia can induce inflammation in the gastrointestinal tract. However, the impact of hypoxia on the course of inflammatory bowel disease (IBD) is poorly understood. We aimed to evaluate whether flights and/or journeys to regions lying at an altitude of >2000 m above sea level are associated with flare-ups within 4 weeks of the trip. METHODS IBD patients with at least one flare-up during a 12-month observation period were compared to a group of patients in remission. Both groups completed a questionnaire. RESULTS A total of 103 IBD patients were included (43 with Crohn's disease (CD): mean age 39.3 ± 14.6 years; 60 with ulcerative colitis (UC): mean age 40.4 ± 15.1 years). Fifty-two patients with flare-ups were matched to 51 patients in remission. IBD patients experiencing flare-ups had more frequently undertaken flights and/or journeys to regions >2000 m above sea level within 4 weeks of the flare-up than patients in remission (21/52 [40.4%] vs. 8/51 [15.7%], p=0.005). CONCLUSIONS Journeys to high-altitude regions and/or flights are a risk factor for IBD flare-ups occurring within 4 weeks of travel.
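The reported comparison (21/52 vs. 8/51, p=0.005) can be reproduced with a chi-squared test on the 2×2 contingency table; a sketch, assuming scipy (Yates' continuity correction is disabled to match the reported p-value):

```python
# Sketch: reproducing the flare-up comparison (21/52 vs. 8/51) with a
# chi-squared test on the 2x2 contingency table.
from scipy.stats import chi2_contingency

#        exposed (flight/altitude)   not exposed
table = [[21, 52 - 21],   # flare-up group
         [8,  51 - 8]]    # remission group

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # ~7.77, p ~ 0.005, as in the abstract
```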
Abstract:
INTRODUCTION Current literature suggesting a higher bleeding risk during combination therapy than during oral anticoagulation alone is based primarily on retrospective studies or specific populations. We aimed to prospectively evaluate whether unselected medical patients on oral anticoagulation have an increased risk of bleeding when on concomitant antiplatelet therapy. MATERIAL AND METHODS We prospectively studied consecutive adult medical patients who were discharged on oral anticoagulants between 01/2008 and 03/2009 from a Swiss university hospital. The primary outcome was the time to a first major bleed on oral anticoagulation within 12 months, adjusted for age, international normalized ratio target, number of medications, and history of myocardial infarction and major bleeding. RESULTS Among the 515 included anticoagulated patients, the incidence rate of a first major bleed was 8.2 per 100 patient-years. Overall, 161 patients (31.3%) were on both anticoagulant and antiplatelet therapy, and these patients had an incidence rate of major bleeding similar to that of patients on oral anticoagulation alone (7.6 vs. 8.4 per 100 patient-years, P=0.81). In multivariate analysis, the association of concomitant antiplatelet therapy with the risk of major bleeding was not statistically significant (hazard ratio 0.89; 95% confidence interval 0.37-2.10). CONCLUSIONS The risk of bleeding in patients receiving oral anticoagulants combined with antiplatelet therapy was similar to that in patients receiving oral anticoagulants alone, suggesting that the incremental bleeding risk of combination therapy may not be clinically significant.
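Incidence rates of this kind are event counts divided by accumulated follow-up time; a sketch with an exact Poisson confidence interval, noting that the event count and person-years below are hypothetical, since the abstract reports only the resulting rates:

```python
# Sketch: incidence rate per 100 patient-years with an exact (Garwood)
# Poisson confidence interval. The inputs are hypothetical; the abstract
# reports only the resulting rate (8.2 per 100 patient-years).
from scipy.stats import chi2

def incidence_rate(events, person_years, alpha=0.05):
    rate = 100.0 * events / person_years
    lo = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    return rate, 100.0 * lo / person_years, 100.0 * hi / person_years

# e.g. 35 bleeds over 427 patient-years -> 8.2 per 100 patient-years
print(incidence_rate(events=35, person_years=427.0))
```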
Abstract:
Bovine tuberculosis (bTB) caused by Mycobacterium bovis or M. caprae has recently (re-)emerged in livestock and wildlife in all countries bordering Switzerland (CH) and the Principality of Liechtenstein (FL). Comprehensive data for Swiss and Liechtenstein wildlife are not available so far, although two native species, wild boar (Sus scrofa) and red deer (Cervus elaphus elaphus), act as bTB reservoirs elsewhere in continental Europe. Our aims were (1) to assess the occurrence of bTB in these wild ungulates in CH/FL and to reinforce scanning surveillance in all wild mammals; and (2) to evaluate the risk of future bTB reservoir formation in wild boar and red deer in CH/FL. Tissue samples collected from 2009 to 2011 from 434 hunted red deer and wild boar and from eight diseased ungulates with tuberculosis-like lesions were tested by direct real-time PCR and culture to detect mycobacteria of the Mycobacterium tuberculosis complex (MTBC). Identification of suspicious colonies was attempted by real-time PCR, genotyping and spoligotyping. Information on risk factors for bTB maintenance within wildlife populations was retrieved from the literature, and the situation regarding identified factors was assessed for our study areas. Mycobacteria of the MTBC were detected in six of 165 wild boar (3.6%; 95% CI: 1.4-7.8) but in none of the 269 red deer (0%; 0-1.4). M. microti was identified in two MTBC-positive wild boar, while species identification remained unsuccessful in four cases. The main risk factors for bTB maintenance worldwide, including different causes of aggregation often resulting from intensive wildlife management, are largely absent in CH and FL. In conclusion, M. bovis and M. caprae were not detected, but we report MTBC mycobacteria in Swiss wild boar for the first time. Present conditions seem unfavorable for the emergence of a reservoir; nevertheless, increasing populations of wild ungulates and the consumption of offal may represent a risk.
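The reported prevalence intervals match exact Clopper-Pearson binomial limits, which can be checked with statsmodels; a sketch:

```python
# Sketch: exact Clopper-Pearson 95% intervals for the reported MTBC
# prevalences: 6/165 wild boar and 0/269 red deer.
from statsmodels.stats.proportion import proportion_confint

for positives, n in [(6, 165), (0, 269)]:
    lo, hi = proportion_confint(positives, n, alpha=0.05, method="beta")
    print(f"{positives}/{n}: {100 * positives / n:.1f}% "
          f"(95% CI {100 * lo:.1f}-{100 * hi:.1f})")
# prints ~3.6% (1.4-7.8) and 0.0% (0.0-1.4), matching the abstract
```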
Abstract:
Cotrimoxazole reduces mortality in HIV-infected adults with tuberculosis (TB), and in vitro data suggest potential anti-mycobacterial activity of cotrimoxazole. We aimed to evaluate whether cotrimoxazole prophylaxis is associated with a decreased risk of incident TB in participants of the Swiss HIV Cohort Study (SHCS). We determined the incidence of TB per 1000 person-years from January 1992 to December 2012. Rates were analyzed separately in participants with current or no previous antiretroviral treatment (ART) using Poisson regression adjusted for CD4 cell count, sex, region of origin, injecting drug use, and age. 13,431 cohort participants contributed 107,549 person-years of follow-up; 182 patients had incident TB, 132 (73%) before and 50 (27%) after ART initiation. The multivariable incidence rate ratios per year of cumulative cotrimoxazole exposure were 0.70 (95% CI 0.55-0.89) for persons with no previous ART and 0.87 (95% CI 0.74-1.0) for those on current ART. Cotrimoxazole may prevent the development of TB among HIV-positive persons, especially among those with no previous ART.
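Adjusted incidence rate ratios like these typically come from a Poisson regression with log person-years as an offset, as the abstract describes; a minimal sketch with simulated data (the data frame, variable names and effect sizes are illustrative, not the SHCS data):

```python
# Minimal sketch: Poisson regression of TB events on cumulative
# cotrimoxazole exposure with log(person-years) as offset. All data are
# simulated; the built-in effect (IRR ~0.70 per exposure-year) mirrors
# the abstract's estimate for ART-naive participants.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "pyears": rng.uniform(0.5, 10.0, n),
    "cotri_years": rng.uniform(0.0, 3.0, n),
    "age": rng.normal(40.0, 10.0, n),
})
lam = 0.02 * df["pyears"] * np.exp(-0.357 * df["cotri_years"])  # exp(-0.357) ~ 0.70
df["tb"] = rng.poisson(lam)

fit = smf.glm("tb ~ cotri_years + age", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["pyears"])).fit()
print(np.exp(fit.params["cotri_years"]))  # incidence rate ratio per exposure-year
```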
Abstract:
BACKGROUND Empirical research has demonstrated an association between study size and relative treatment effects, but conclusions about the association of study size with risk-of-bias items have been inconsistent. Small studies generally give imprecisely estimated treatment effects, and study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks including 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and between-studies variation in network meta-regression models as a function of the variance of the observed effect size and indicators for the adequacy of each risk-of-bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09-3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially affect the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared to more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.
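A heavily simplified, single-meta-analysis analogue of this design regresses study log odds ratios on their variances, so that the exponentiated slope acts as a ratio of odds ratios per unit of variance; the sketch below uses made-up study data and ignores the Bayesian network structure of the actual analysis:

```python
# Heavily simplified sketch of a small-study-effects meta-regression:
# regress study log odds ratios on their variances with inverse-variance
# weights; exp(slope) is a ratio of odds ratios per unit of variance.
# Study data are made up; the real analysis fits Bayesian network
# meta-regression models across 32 networks.
import numpy as np
import statsmodels.api as sm

log_or = np.array([0.10, 0.35, 0.05, 0.60, 0.20, 0.80])  # made-up effects
var = np.array([0.02, 0.15, 0.01, 0.40, 0.05, 0.55])     # their variances

fit = sm.WLS(log_or, sm.add_constant(var), weights=1.0 / var).fit()
print("ROR per unit of variance:", np.exp(fit.params[1]))
```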
Abstract:
PRINCIPLES To evaluate the validity and feasibility of a novel photography-based home assessment (PhoHA) protocol as a possible substitute for on-site home assessment (OsHA). METHODS A total of 20 patients aged ≥65 years who were hospitalised in a rehabilitation centre for musculoskeletal disorders affecting mobility participated in this prospective validation study. For PhoHA, occupational therapists rated photographs and measurements of patients' homes provided by patients' confidants. For OsHA, occupational therapists conducted a conventional home visit. RESULTS Information obtained by PhoHA was 79.1% complete (1,120 environmental factors identified by PhoHA vs 1,416 by OsHA). Of the 1,120 factors, 749 had dichotomous scores (potential hazards) and 371 had continuous scores (measurements with a tape measure). Validity of PhoHA for identifying potential hazards was good (sensitivity 78.9%, specificity 84.9%), except in two subdomains (pathways, slippery surfaces). Pearson's correlation coefficient for the validity of measurements was 0.87 (95% confidence interval [CI] 0.80-0.92, p <0.001). Agreement between methods was 0.52 (95% CI 0.34-0.67, p <0.001, Cohen's kappa coefficient) for dichotomous scores and 0.86 (95% CI 0.79-0.91, p <0.001, intraclass correlation coefficient) for continuous scores. Costs of PhoHA were 53.0% lower than those of OsHA (p <0.001). CONCLUSIONS PhoHA has good concurrent validity for environmental assessment if instructions for confidants are improved. PhoHA is potentially a cost-effective method for environmental assessment.
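Validity and agreement statistics of the kind reported here can be computed from paired ratings, treating OsHA as the reference method; a sketch with made-up dichotomous ratings, assuming scikit-learn:

```python
# Sketch: sensitivity/specificity and Cohen's kappa for dichotomous hazard
# ratings, with OsHA as the reference. The rating vectors are made up.
from sklearn.metrics import cohen_kappa_score, confusion_matrix

osha  = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # on-site assessment (reference)
phoha = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]  # photography-based assessment

tn, fp, fn, tp = confusion_matrix(osha, phoha).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
kappa = cohen_kappa_score(osha, phoha)
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} kappa={kappa:.2f}")
```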
Abstract:
The purpose of this study was to determine the perception and knowledge of targeted ultrasound in women who screened positive for Down syndrome in the first or second trimester, and to assess the perceived detection rate of Down syndrome by targeted ultrasound in this population. While several studies have reported patients' perceptions of routine ultrasound, no study has specifically examined knowledge of targeted ultrasound and its role in detecting Down syndrome. A targeted ultrasound is a specialized second-trimester ultrasound offered to women who may be at higher-than-average risk of having a baby with some type of birth defect or complication. Its purpose is to evaluate the overall growth and development of the baby and to screen for birth defects and genetic conditions. Women under the age of 35 referred to several Houston-area clinics for an abnormal first- or second-trimester maternal serum screen were asked to complete a questionnaire to obtain demographic and ultrasound knowledge information and to assess the perceived detection rate of Down syndrome by ultrasound. Seventy-seven women completed the questionnaire and participated in the study. Our findings revealed that women have limited background knowledge about targeted ultrasound and its role in detecting Down syndrome. These findings are consistent with other studies that have reported a lack of understanding of the purpose of ultrasound examinations. One factor that appears to increase background knowledge about targeted ultrasound is a higher level of education. However, most participants, regardless of race, education, income, and exposure to targeted ultrasound information, did not know the capabilities of a targeted ultrasound. This study confirmed that women lack background knowledge about targeted ultrasound and do not know enough about the technology to form a perception of its ability to detect Down syndrome. Additional studies are needed to identify appropriate education techniques and to determine how best to inform our patient population about targeted ultrasound.
Abstract:
OBJECTIVE: We sought to evaluate the performance of the human papillomavirus high-risk DNA test in patients 30 years and older. MATERIALS AND METHODS: Screening (n=835) and diagnosis (n=518) groups were defined based on prior Papanicolaou smear results as part of a clinical trial for cervical cancer detection. We compared the Hybrid Capture II (HCII) test result with the worst histologic report. We used cervical intraepithelial neoplasia (CIN) 2/3 or worse as the reference standard for disease. We calculated sensitivities, specificities, positive and negative likelihood ratios (LR+ and LR-), receiver operating characteristic (ROC) curves, and areas under the ROC curves for the HCII test. We also considered alternative strategies, including Papanicolaou smear, a combination of Papanicolaou smear and the HCII test, a sequence of Papanicolaou smear followed by the HCII test, and a sequence of the HCII test followed by Papanicolaou smear. RESULTS: For the screening group, the sensitivity was 0.69 and the specificity was 0.93; the area under the ROC curve was 0.81. The LR+ and LR- were 10.24 and 0.34, respectively. For the diagnosis group, the sensitivity was 0.88 and the specificity was 0.78; the area under the ROC curve was 0.83. The LR+ and LR- were 4.06 and 0.14, respectively. Sequential testing showed little or no improvement over combination testing. CONCLUSIONS: The HCII test in the screening group had a greater LR+ for the detection of CIN 2/3 or worse. HCII testing may be an additional screening tool for cervical cancer in women 30 years and older.
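The likelihood ratios follow directly from sensitivity and specificity: LR+ = sensitivity/(1 − specificity) and LR− = (1 − sensitivity)/specificity. A quick check against the abstract's rounded values:

```python
# Quick check of the reported likelihood ratios. Small discrepancies
# (e.g. 9.86 vs. 10.24) reflect rounding: the published LRs were computed
# from the raw counts, not the rounded sensitivity/specificity.
def likelihood_ratios(sens, spec):
    return sens / (1 - spec), (1 - sens) / spec

print(likelihood_ratios(0.69, 0.93))  # screening: ~(9.86, 0.33) vs. (10.24, 0.34)
print(likelihood_ratios(0.88, 0.78))  # diagnosis: ~(4.00, 0.15) vs. (4.06, 0.14)
```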
Abstract:
OBJECTIVE: The objective of this study was to evaluate the impact of newer therapies on the highest risk patients with congenital diaphragmatic hernia (CDH), those with agenesis of the diaphragm. SUMMARY BACKGROUND DATA: CDH remains a significant cause of neonatal mortality. Many novel therapeutic interventions have been used in these infants. Those children with large defects or agenesis of the diaphragm have the highest mortality and morbidity. METHODS: Twenty centers from 5 countries collected data prospectively on all liveborn infants with CDH over a 10-year period. The treatment and outcomes in these patients were examined. Patients were followed until death or hospital discharge. RESULTS: A total of 1,569 patients with CDH were seen between January 1995 and December 2004 in 20 centers. A total of 218 patients (14%) had diaphragmatic agenesis and underwent repair. The overall survival for all patients was 68%, while survival was 54% in patients with agenesis. When patients with diaphragmatic agenesis from the first 2 years were compared with similar patients from the last 2 years, there was significantly less use of ECMO (75% vs. 52%) and an increased use of inhaled nitric oxide (iNO) (30% vs. 80%). There was a trend toward improved survival in patients with agenesis from 47% in the first 2 years to 59% in the last 2 years. The survivors with diaphragmatic agenesis had prolonged hospital stays compared with patients without agenesis (median, 68 vs. 30 days). For the last 2 years of the study, 36% of the patients with agenesis were discharged on tube feedings and 22% on oxygen therapy. CONCLUSIONS: There has been a change in the management of infants with CDH with less frequent use of ECMO and a greater use of iNO in high-risk patients with a potential improvement in survival. However, the mortality, hospital length of stay, and morbidity in agenesis patients remain significant.
Abstract:
A cohort of 418 United States Air Force (USAF) personnel from over 15 different bases deployed to Morocco in 1994. This was the first study of its kind and was designed with two primary goals: to determine whether the USAF was medically prepared to deploy with its changing mission in the new world order, and to evaluate factors that might improve or degrade USAF medical readiness. The mean length of deployment was 21 days. The cohort was 95% male, 86% enlisted, 65% married, and 78% white. This study shows major deficiencies, indicating that the USAF medical readiness posture has not fully responded to its new mission requirements. Lack of required logistical items (e.g., mosquito nets, rainboots, DEET insecticide cream) revealed a low state of preparedness. The most notable deficiency was that 82.5% (95% CI = 78.4, 85.9) did not have permethrin-pretreated mosquito nets and 81.0% (95% CI = 76.8, 84.6) lacked mosquito net poles. Additionally, 18% were deficient on vaccinations and 36% had not received a tuberculin skin test. Excluding injections, overall compliance with preventive medicine requirements had a mean frequency of only 50.6% (95% CI = 45.36, 55.90). Several factors had a positive impact on compliance with logistical requirements. The most prominent was "receiving a medical intelligence briefing" from USAF Public Health. After adjustment for mobility and age, individuals who underwent a briefing were 17.2 (95% CI = 4.37, 67.99) times more likely to have received an immunoglobulin shot and 4.2 (95% CI = 1.84, 9.45) times more likely to start their antimalarial prophylaxis at the proper time. "Personnel on mobility" had the second strongest positive effect on medical readiness. When mobility and briefing were included in models, "personnel on mobility" were 2.6 (95% CI = 1.19, 5.53) times as likely to have DEET insecticide and 2.2 (95% CI = 1.16, 4.16) times as likely to have had a TB skin test. Five recommendations to improve the medical readiness of the USAF were outlined: upgrade base-level logistical support, improve medical intelligence messages, include medical requirements on travel orders, place more personnel on mobility or deploy only personnel on mobility, and conduct research dedicated to capitalizing on the powerful effect of predeployment briefings. Since this is the first study of its kind, more studies should be performed in different geographic theaters to assess medical readiness and establish acceptable compliance levels for the USAF.
Abstract:
In both personal and societal contexts, people often evaluate the risk of environmental and technological hazards. Previous neuroscience research on risk evaluation has mostly assessed the direct personal risk of presented stimuli, which may have comprised, for instance, aspects of fear. Furthermore, risk evaluation has primarily been compared with tasks from other cognitive domains serving as control conditions, thus revealing brain activity related to risk in general, but not activity specifically associated with estimating a higher level of risk. Here we investigated the neural basis on which laypersons individually evaluated the risk that different potential hazards pose to society. Twenty healthy subjects underwent functional magnetic resonance imaging while evaluating the risk of fifty more or less risky conditions presented as written terms. Brain activations during the individual estimations of 'high' versus 'low' risk, and of negative versus neutral and positive emotional valences, were analyzed. Estimating hazards to be of high risk was associated with activation in the medial thalamus, anterior insula, caudate nucleus, cingulate cortex, and further prefrontal and temporo-occipital areas. These areas were not involved according to an analysis of the emotion ratings. In conclusion, we emphasize a contribution of these brain areas to signaling high risk, one not primarily associated with the emotional valence of the risk items. These areas have previously been reported to be associated with viscerosensitive and implicit processing, in addition to emotional processing. This suggests an intuitive contribution, or a "gut feeling", not necessarily dependent on subjective emotional valence, when a high risk of environmental hazards is estimated.
Abstract:
Poor udder health represents a serious problem in dairy production and has been investigated intensively, but heifers generally have not been the main focus of mastitis control. The aim of this study was to evaluate the prevalence, risk factors and consequences of heifer mastitis in Switzerland. The study included 166,518 heifers of different breeds (Swiss Red Pied, Swiss Brown Cattle and Holstein). Monthly somatic cell counts (SCCs) provided by the main dairy breeding organisations in Switzerland were monitored for 3 years; the prevalence of subclinical mastitis (SCM) was determined on the basis of SCCs ≥100,000 cells/mL at the first test date. The probability of having SCM at the first test date during lactation was modelled using logistic regression. Analysed factors included genetic background, morphological traits, geographical region, season of parturition and milk composition. The overall prevalence of SCM in heifers during the period from 2006 to 2010 was 20.6%. Higher frequencies of SCM were present in heifers of the Holstein breed (odds ratio, OR, 1.62), heifers with high fat:protein ratios (OR 1.97) and heifers with low milk urea concentrations combined with high milk protein concentrations (OR 3.97). Traits associated with a low risk of SCM were high set udders, high overall breeding values and low milk breeding values. Heifers with SCM on the first test day had a higher risk of either developing chronic mastitis or leaving the herd prematurely.
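The probability model described here is a logistic regression, with odds ratios obtained as exponentiated coefficients; a minimal sketch with simulated data (variable names and the data are illustrative, not the Swiss breeding-organisation records):

```python
# Minimal sketch: logistic regression of SCM at the first test date, with
# odds ratios as exponentiated coefficients. Data are simulated so that the
# true ORs equal the abstract's 1.62 (Holstein) and 1.97 (fat:protein).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 20000
df = pd.DataFrame({
    "holstein": rng.integers(0, 2, n),
    "high_fat_protein": rng.integers(0, 2, n),
})
logit = -1.5 + np.log(1.62) * df["holstein"] + np.log(1.97) * df["high_fat_protein"]
df["scm"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

fit = smf.logit("scm ~ holstein + high_fat_protein", data=df).fit(disp=False)
print(np.exp(fit.params))  # odds ratios, near 1.62 and 1.97 by construction
```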
Abstract:
This study aims to evaluate the potential impacts of ocean acidification on North Atlantic deep-sea ecosystems in response to IPCC AR5 Representative Concentration Pathways (RCPs). Deep-sea biota are likely highly vulnerable to changes in seawater chemistry and sensitive to even moderate excursions in pH. Here we show, from seven fully coupled Earth system models, that for three out of four RCPs over 17% of the seafloor area below 500 m depth in the North Atlantic sector will experience pH reductions exceeding −0.2 units by 2100. Increased stratification in response to climate change partially alleviates the impact of ocean acidification on deep benthic environments. We report major pH reductions over the deep North Atlantic seafloor (depth >500 m) and at important deep-sea features, such as seamounts and canyons. By 2100, under the high-CO2 scenario RCP8.5, pH reductions exceeding −0.2 (−0.3) units are projected in close to 23% (~15%) of North Atlantic deep-sea canyons and ~8% (3%) of seamounts, including seamounts proposed as sites of marine protected areas. The spatial pattern of impacts reflects the depth of the pH perturbation and does not scale linearly with atmospheric CO2 concentration. Impacts may cause negative changes equal in magnitude to, or exceeding, the current target of preserving 10% of marine biomes set by the Convention on Biological Diversity, implying that ocean acidification may offset benefits from conservation and management strategies that rely on regulating resource exploitation.
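Seafloor-area percentages like these reduce to an area-weighted fraction of model grid cells where the projected pH change crosses the threshold; a sketch with synthetic arrays standing in for the Earth-system-model output:

```python
# Sketch: area-weighted fraction of deep (>500 m) seafloor where projected
# pH change by 2100 exceeds -0.2 units. Synthetic arrays stand in for the
# gridded Earth-system-model output used in the study.
import numpy as np

rng = np.random.default_rng(3)
delta_ph = rng.normal(-0.15, 0.08, (180, 360))  # projected pH change per cell
depth = rng.uniform(0.0, 5000.0, (180, 360))    # seafloor depth (m)
lat = np.linspace(-89.5, 89.5, 180)
cell_area = np.cos(np.deg2rad(lat))[:, None] * np.ones((180, 360))  # relative area

deep = depth > 500.0
affected = deep & (delta_ph < -0.2)
frac = cell_area[affected].sum() / cell_area[deep].sum()
print(f"{100 * frac:.1f}% of deep seafloor with a pH reduction beyond 0.2 units")
```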