948 results for Multicast Application Level
Abstract:
AIMS To investigate and quantify the clinical benefits of early versus delayed application of Thomas splints in patients with isolated femur shaft fractures. MATERIALS AND METHODS Level IV retrospective clinical and radiological analysis of patients presenting from January to December 2012 at a Level 1 Trauma Unit. All skeletally mature patients with isolated femur shaft fractures were included, regardless of mechanism of injury. Exclusion criteria were: ipsilateral fracture of the lower limb, femoral neck and supracondylar femur fractures, and periprosthetic and incomplete fractures. Clinical records were analysed for blood transfusion requirements, pulmonary complications, surgery time, duration of hospital stay and analgesic requirements. RESULTS A total of 106 patients (74 males, 32 females) met our inclusion criteria. Fifty-seven patients (54%) were in the 'early splinted' group and 49 patients (46%) in the 'delayed splinted' group (P>0.05). The need for blood transfusion was significantly reduced in the 'early splinted' group (P=0.04), and the rate of pulmonary complications was significantly higher in the 'delayed splinted' group (P=0.008). All other parameters were similar between the two groups. CONCLUSION Early application of Thomas splints for isolated femur fractures in non-polytraumatised patients yields a clinically and statistically significant benefit, reducing both the need for blood transfusions and the incidence of pulmonary complications.
Abstract:
Soils provide us with over 90% of all human food, livestock feed, fibre and fuel on Earth. Soils, however, have more than just productive functions. The key challenge in coming years will be to address the diverse and potentially conflicting demands now being made by human societies and other forms of life, while ensuring that future generations have the same potential to use soils and land of comparable quality. In a multi-level stakeholder approach, down-to-earth action will have to be supplemented with measures at various levels, from households to communities, and from national policies to international conventions. Knowledge systems, both indigenous and scientific, and related research and learning processes must play a central role. Ongoing action can be enhanced through a critical assessment of the impact of past achievements, and through better cooperation between people and institutions.
Abstract:
A 37-year-old man presented with a 4-day history of nonbloody diarrhea, fever, chills, productive cough, vomiting, and a more recent sore throat. He worked for the municipality of a village in the Swiss Alps near St. Moritz. Examination showed fever (40 °C), hypotension, tachycardia, tachypnea, decreased oxygen saturation (90% on room air), and bibasilar crackles and wheezing. Chest radiography and computed tomography showed an infiltrate in the left upper lung lobe. He responded to empiric therapy with imipenem for 5 days. After the imipenem was stopped, the bacteriology laboratory reported that 2/2 blood cultures had grown Francisella tularensis, and the patient had a recurrence of fever and diarrhea. He was treated with oral ciprofloxacin (500 mg twice daily for 14 days) and his symptoms resolved. Further testing confirmed that the isolate was F. tularensis subspecies holarctica belonging to subclade B.FTNF002-00 (Western European cluster). This case should alert physicians that tularemia can occur in high-altitude regions such as the Swiss Alps.
Abstract:
OBJECTIVES This study sought to evaluate: 1) the effect of impaired renal function on long-term clinical outcomes in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES); and 2) the safety and efficacy of new-generation compared with early-generation DES in women with chronic kidney disease (CKD). BACKGROUND The prevalence and effect of CKD in women undergoing PCI with DES are unclear. METHODS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by creatinine clearance (CrCl): <45 ml/min, 45 to 59 ml/min, and ≥60 ml/min. The primary endpoint was the 3-year rate of major adverse cardiovascular events (MACE). Participants for whom baseline creatinine was missing were excluded from the analysis. RESULTS Of the 4,217 women treated with DES for whom serum creatinine was available, 603 (14%) had a CrCl <45 ml/min, 811 (19%) had a CrCl of 45 to 59 ml/min, and 2,803 (66%) had a CrCl ≥60 ml/min. A significant stepwise gradient in MACE risk was observed with worsening renal function (26.6% vs. 15.8% vs. 12.9%; p < 0.01). After multivariable adjustment, CrCl <45 ml/min was independently associated with a higher risk of MACE (adjusted hazard ratio: 1.56; 95% confidence interval: 1.23 to 1.98) and all-cause mortality (adjusted hazard ratio: 2.67; 95% confidence interval: 1.85 to 3.85). Compared with early-generation DES, new-generation DES were associated with a reduced risk of cardiac death, myocardial infarction, or stent thrombosis in women with CKD, and their effect on outcomes was uniform between women with and without CKD, with no evidence of interaction. CONCLUSIONS Among women undergoing PCI with DES, CKD is a common comorbidity associated with a strong and independent risk of MACE that persists over 3 years. The benefits of new-generation DES are uniform in women with and without CKD.
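The abstract does not state which estimating equation was used to derive creatinine clearance; as a minimal sketch, the Python snippet below assumes the widely used Cockcroft-Gault formula (an assumption, not the study's documented method) and reproduces the study's three CrCl strata:

```python
def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female=True):
    """Estimate creatinine clearance (ml/min) with the Cockcroft-Gault formula."""
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl  # 0.85 correction for women

def crcl_stratum(crcl_ml_min):
    """Assign the three strata used in the study."""
    if crcl_ml_min < 45:
        return "<45 ml/min"
    if crcl_ml_min < 60:
        return "45-59 ml/min"
    return ">=60 ml/min"

# Example: a 70-year-old woman, 60 kg, serum creatinine 1.4 mg/dl.
crcl = cockcroft_gault_crcl(70, 60, 1.4)
print(round(crcl, 1), crcl_stratum(crcl))  # 35.4 "<45 ml/min"
```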
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors are unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by the presence or absence of high ATR, defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of the 10,449 women included in the pooled database, 5,333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had a significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In high-ATR women, new-generation DES were associated with a significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) than early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P for interaction=0.14). In landmark analysis of high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years the stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Even in women at high ATR, the use of new-generation DES is associated with a benefit that is consistent over 3 years of follow-up and with a substantial improvement in very-late thrombotic safety.
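A landmark analysis of this kind splits follow-up at a fixed time point (here, 1 year) and analyses the early and late windows separately. A minimal sketch of that splitting step, assuming a patient-level table with hypothetical column names time_to_st and st_event:

```python
import pandas as pd

def landmark_split(df, landmark=1.0, time_col="time_to_st", event_col="st_event"):
    """Split follow-up at a landmark time (years) so that early and late
    stent thrombosis can be analysed in separate windows."""
    early = df.copy()
    past = early[time_col] > landmark
    early.loc[past, time_col] = landmark        # censor at the landmark
    early.loc[past, event_col] = 0

    late = df[df[time_col] > landmark].copy()   # only patients event-free at the landmark
    late[time_col] -= landmark                  # clock restarts at the landmark
    return early, late

patients = pd.DataFrame({"time_to_st": [0.4, 1.5, 2.8, 3.0],
                         "st_event":   [1,   1,   1,   0]})
early, late = landmark_split(patients)
```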
Abstract:
This paper describes methods and results for the annotation of two discourse-level phenomena, connectives and pronouns, over a multilingual parallel corpus. Excerpts from Europarl in English and French have been annotated with disambiguation information for connectives and pronouns, covering about 3,600 tokens. These data are then used in several ways: for cross-linguistic studies, for training automatic disambiguation software, and ultimately for training and testing discourse-aware statistical machine translation systems. The paper presents the annotation procedures and their results in detail, and gives an overview of the first systems trained on the annotated resources and their use for machine translation.
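As a rough illustration of how such disambiguation annotations can feed automatic software, the sketch below assumes a hypothetical record layout and sense labels (the corpus's actual annotation scheme differs) and trains a toy scikit-learn sense classifier on the annotated contexts:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical records for disambiguated connective tokens.
annotations = [
    {"token": "while", "sense": "contrast",
     "context": "the report was adopted while several members abstained"},
    {"token": "while", "sense": "temporal",
     "context": "members may not speak while the vote is in progress"},
    {"token": "since", "sense": "causal",
     "context": "the motion failed since it lacked a second reading"},
    {"token": "since", "sense": "temporal",
     "context": "the rule has applied since the treaty entered into force"},
]

# A toy classifier mapping a connective's context to its sense label.
clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
clf.fit([a["context"] for a in annotations],
        [a["sense"] for a in annotations])
print(clf.predict(["the house rose while the count was under way"]))
```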
Abstract:
BACKGROUND Diabetes mellitus and angiographic coronary artery disease complexity are intertwined and unfavorably affect prognosis after percutaneous coronary intervention, but their relative impact on long-term outcomes after percutaneous coronary intervention with drug-eluting stents remains controversial. This study determined drug-eluting stent outcomes in relation to diabetic status and coronary artery disease complexity as assessed by the Synergy Between PCI With Taxus and Cardiac Surgery (SYNTAX) score. METHODS AND RESULTS In a patient-level pooled analysis of 4 all-comers trials, 6081 patients were stratified according to diabetic status and according to the median SYNTAX score (≤11 or >11). The primary end point was major adverse cardiac events, a composite of cardiac death, myocardial infarction, and clinically indicated target lesion revascularization within 2 years. Diabetes mellitus was present in 1310 patients (22%), and new-generation drug-eluting stents were used in 4554 patients (75%). Major adverse cardiac events occurred in 173 diabetic patients (14.5%) and 436 nondiabetic patients (9.9%; P<0.001). In adjusted Cox regression analyses, SYNTAX score and diabetes mellitus were both associated with the primary end point (P<0.001 and P=0.028, respectively; P for interaction, 0.07). In multivariable analyses, diabetic versus nondiabetic patients had higher risks of major adverse cardiac events (hazard ratio, 1.25; 95% confidence interval, 1.03-1.53; P=0.026) and target lesion revascularization (hazard ratio, 1.54; 95% confidence interval, 1.18-2.01; P=0.002) but similar risks of cardiac death (hazard ratio, 1.41; 95% confidence interval, 0.96-2.07; P=0.08) and myocardial infarction (hazard ratio, 0.89; 95% confidence interval, 0.64-1.22; P=0.45), without significant interaction with SYNTAX score ≤11 or >11 for any of the end points. CONCLUSIONS In this population, treated predominantly with new-generation drug-eluting stents, diabetic patients were at increased risk of repeat target lesion revascularization consistently across the spectrum of disease complexity. The SYNTAX score was an independent predictor of 2-year outcomes but did not modify the effect of diabetes mellitus. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00297661, NCT00389220, NCT00617084, and NCT01443104.
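A minimal sketch of a Cox model with an interaction term of the kind used to test whether the SYNTAX stratum modifies the diabetes effect; the data are synthetic and the column names are illustrative assumptions, not the trial variables (Python, lifelines):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic cohort: diabetics with complex anatomy fail somewhat sooner.
rng = np.random.default_rng(0)
n = 400
diabetes = rng.integers(0, 2, n)
syntax_hi = rng.integers(0, 2, n)            # SYNTAX score > 11
hazard = 0.08 * np.exp(0.3 * diabetes + 0.4 * syntax_hi)
t = rng.exponential(1 / hazard)

df = pd.DataFrame({
    "time": np.minimum(t, 2.0),              # administrative censoring at 2 years
    "mace": (t <= 2.0).astype(int),
    "diabetes": diabetes,
    "syntax_hi": syntax_hi,
})
df["dm_x_syntax"] = df["diabetes"] * df["syntax_hi"]  # interaction term

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="mace")
# exp(coef) gives hazard ratios; the p value on dm_x_syntax is the
# interaction test (analogous to the abstract's P for interaction).
print(cph.summary[["exp(coef)", "p"]])
```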
Abstract:
Research question: International and national sport federations, as well as their member organisations, are key actors within the sport system and have a wide range of relationships outside it (e.g. with the state, sponsors, and the media). They currently face major challenges such as growing competition in top-level sports, the democratisation of sport through 'sports for all', and sport as an answer to social problems. In this context, professionalising sport organisations seems an appropriate strategy for facing these challenges and current problems. We define the professionalisation of sport organisations as an organisational process of transformation leading towards organisational rationalisation, efficiency and business-like management. This has led to profound organisational change, particularly within sport federations, characterised by the strengthening of institutional management (managerialism) and the implementation of efficiency-based management instruments and paid staff. Research methods: The goal of this article is to review the current international literature and establish a global understanding of, and theoretical framework for, analysing why and how sport organisations professionalise and what consequences this may have. Results and findings: Our multi-level approach, based on the social theory of action, integrates the current concepts for analysing professionalisation in sport federations. We specify the framework for the following research perspectives: (1) forms, (2) causes and (3) consequences, and discuss the reciprocal relations between sport federations and their member organisations in this context. Implications: Finally, we set out a research agenda and derive general methodological implications for the investigation of professionalisation processes in sport organisations.
Abstract:
Systematic consideration of scientific support is a critical element in developing and, ultimately, using adverse outcome pathways (AOPs) for various regulatory applications. Although weight of evidence (WoE) analysis has been proposed as a basis for assessing the maturity of, and level of confidence in, an AOP, methodologies and tools are still being formalized. The Organisation for Economic Co-operation and Development (OECD) Users' Handbook Supplement to the Guidance Document for Developing and Assessing AOPs (OECD 2014a; hereafter the OECD AOP Handbook) provides tailored Bradford-Hill (BH) considerations for systematic assessment of confidence in a given AOP. These considerations include (1) biological plausibility and (2) empirical support (dose-response, temporality, and incidence) for Key Event Relationships (KERs), and (3) essentiality of key events (KEs). Here, we test the application of these tailored BH considerations and the guidance outlined in the OECD AOP Handbook on a number of case examples, to build experience in documenting more transparently the rationales for the levels of confidence assigned to KEs and KERs, and to promote consistency in evaluation within and across AOPs. The major lessons learned are documented and, taken together with the case examples, should contribute to a better common understanding of the nature and form of documentation required to increase confidence in the application of AOPs for specific uses. Based on the tailored BH considerations and defining questions, a prototype quantitative model for assessing the WoE of an AOP using tools of multi-criteria decision analysis (MCDA) is described. The applicability of the approach is demonstrated using the case example of aromatase inhibition leading to reproductive dysfunction in fish. Following the acquisition of additional experience in the development and assessment of AOPs, further refinement of the model's parameterization through expert elicitation is recommended. Overall, quantitative WoE approaches hold promise to enhance the rigor, transparency and reproducibility of AOP WoE determinations, and may play an important role in delineating areas where research would have the greatest impact on improving overall confidence in the AOP.
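A minimal sketch of a weighted-sum MCDA aggregation of the kind the prototype model describes; the weights and the 1-5 confidence scale below are illustrative assumptions, not the parameterization from the paper:

```python
# Hypothetical weights over the tailored Bradford-Hill considerations.
WEIGHTS = {
    "biological_plausibility": 0.4,
    "empirical_support": 0.4,   # dose-response, temporality, incidence
    "essentiality": 0.2,
}

def aop_woe_score(scores, weights=WEIGHTS):
    """Weighted-sum MCDA aggregation of per-consideration confidence scores (1-5)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * scores[k] for k in weights)

# Example AOP: strong plausibility, moderate empirical support, weak essentiality.
print(aop_woe_score({"biological_plausibility": 5,
                     "empirical_support": 3,
                     "essentiality": 2}))  # -> 3.6
```

In a full MCDA the weights would come from expert elicitation, as the abstract recommends; the weighted sum is only the simplest aggregation rule.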
Abstract:
BACKGROUND In an effort to reduce firearm mortality rates in the USA, US states have enacted a range of firearm laws that either strengthen or deregulate the main existing federal gun control law, the Brady Law. We set out to determine the independent association of different firearm laws with overall firearm mortality, homicide firearm mortality, and suicide firearm mortality across all US states. We also projected the potential reduction in firearm mortality if the three most strongly associated firearm laws were enacted at the federal level. METHODS We constructed a cross-sectional, state-level dataset from Nov 1, 2014, to May 15, 2015, using counts of firearm-related deaths in each US state for the years 2008-10 (stratified by intent [homicide and suicide]) from the US Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System, data on 25 state firearm laws implemented in 2009, and state-specific characteristics such as firearm ownership for 2013, firearm export rates and non-firearm homicide rates for 2009, and unemployment rates for 2010. Our primary outcome measure was overall firearm-related mortality per 100 000 people in the USA in 2010. We used Poisson regression with robust variances to derive incidence rate ratios (IRRs) and 95% CIs. FINDINGS 31 672 firearm-related deaths occurred in 2010 in the USA (10·1 per 100 000 people; mean state-specific count 631·5 [SD 629·1]). Of 25 firearm laws, nine were associated with reduced firearm mortality, nine were associated with increased firearm mortality, and seven had an inconclusive association. After adjustment for relevant covariates, the three state laws most strongly associated with reduced overall firearm mortality were universal background checks for firearm purchase (multivariable IRR 0·39 [95% CI 0·23-0·67]; p=0·001), ammunition background checks (0·18 [0·09-0·36]; p<0·0001), and identification requirements for firearms (0·16 [0·09-0·29]; p<0·0001). Projected federal-level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10·35 to 4·46 deaths per 100 000 people; background checks for ammunition purchase could reduce it to 1·99 per 100 000, and firearm identification to 1·81 per 100 000. INTERPRETATION Very few of the existing state-specific firearm laws are associated with reduced firearm mortality, and this evidence underscores the importance of focusing on relevant and effective firearms legislation. National implementation of universal background checks for the purchase of firearms or ammunition, and of firearm identification, could substantially reduce firearm mortality in the USA. FUNDING None.
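A minimal sketch of Poisson regression with robust variances, the study's stated method, on synthetic state-level data; the variable names, effect sizes, and covariates are assumptions for illustration only (Python, statsmodels):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic state-level data standing in for the study's dataset.
rng = np.random.default_rng(1)
n = 50
df = pd.DataFrame({
    "universal_bg_check": rng.integers(0, 2, n),
    "ownership_rate": rng.uniform(0.2, 0.6, n),
    "population": rng.integers(600_000, 10_000_000, n),
})
true_rate = 1e-4 * np.exp(-0.6 * df.universal_bg_check + 1.0 * df.ownership_rate)
df["deaths"] = rng.poisson(true_rate * df.population)

# Poisson regression with a population offset and robust (sandwich) variances.
fit = smf.glm("deaths ~ universal_bg_check + ownership_rate", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df.population)).fit(cov_type="HC0")
print(np.exp(fit.params))            # incidence rate ratios (IRRs)
print(fit.conf_int().apply(np.exp))  # 95% CIs on the IRR scale
```

The offset term models deaths as a rate per person, so exponentiated coefficients read directly as IRRs, matching how the abstract reports its estimates.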
Abstract:
Purpose To determine renal oxygenation changes associated with uninephrectomy and transplantation in both native donor kidneys and transplanted kidneys by using blood oxygenation level-dependent (BOLD) MR imaging. Materials and Methods The study protocol was approved by the local ethics committee, and written informed consent was obtained from each subject. Thirteen healthy kidney donors and their corresponding recipients underwent kidney BOLD MR imaging with a 3-T imager. BOLD MR imaging was performed in donors before uninephrectomy, and in donors and recipients 8 days, 3 months, and 12 months after transplantation. R2* values, which are inversely related to tissue partial pressure of oxygen, were determined in the cortex and medulla. Longitudinal R2* changes were analyzed statistically by using repeated-measures one-way analysis of variance with post hoc pair-wise comparisons. Results R2* values in the remaining kidneys decreased significantly early after uninephrectomy in both the medulla and the cortex (P < .003), from 28.9 sec⁻¹ ± 2.3 to 26.4 sec⁻¹ ± 2.5 in the medulla and from 18.3 sec⁻¹ ± 1.5 to 16.3 sec⁻¹ ± 1.0 in the cortex, indicating increased oxygen content. In donors, R2* remained significantly decreased in both the medulla and the cortex at 3 months (P < .01) and 12 months (P < .01). In transplanted kidneys, R2* remained stable during the first year after transplantation, with no significant change. Among donors, cortical R2* was negatively correlated with estimated glomerular filtration rate (R = -0.47, P < .001). Conclusion The results suggest that BOLD MR imaging may potentially be used to monitor renal functional changes in both remaining and corresponding transplanted kidneys.
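R2* is conventionally estimated by fitting a monoexponential decay S(TE) = S0·exp(−R2*·TE) across the echo times of a multi-echo gradient-echo acquisition; the abstract does not detail the fitting procedure, so the sketch below shows the standard log-linear fit with illustrative echo times:

```python
import numpy as np

def fit_r2star(echo_times_s, signals):
    """Estimate R2* (1/s) from multi-echo signal decay by a log-linear fit
    of S(TE) = S0 * exp(-R2* * TE); the slope of log(S) vs TE is -R2*."""
    slope, _intercept = np.polyfit(echo_times_s, np.log(signals), 1)
    return -slope

# Example: five echoes decaying with a true R2* of 28 /s (medullary range).
te = np.array([0.005, 0.010, 0.015, 0.020, 0.025])   # echo times in seconds
signal = 1000.0 * np.exp(-28.0 * te)
print(fit_r2star(te, signal))  # ~28.0
```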
Abstract:
The objective of this survey was to determine herd-level risk factors for mortality, unwanted early slaughter, and metaphylactic application of antimicrobial group therapy in Swiss veal calves in 2013. A questionnaire on farm structure, farm management, mortality and antimicrobial use was sent to all farmers registered in a Swiss label program that sets requirements for improved animal welfare and sustainability. Risk factors were determined by multivariable logistic regression. A total of 619 veal producers returned a usable questionnaire (response rate=28.5%), of whom 40.9% fattened only their own calves (group O), 56.9% their own calves plus additional purchased calves (group O&P), and 2.3% only purchased calves (group P). A total of 19,077 calves entered the fattening units in 2013, of which 21.7%, 66.7%, and 11.6% belonged to groups O, O&P, and P, respectively. Mortality was 0% in 322 herds (52.0%), between 0% and 3% in 47 herds (7.6%), and ≥3% in 250 herds (40.4%). Significant risk factors for mortality were purchasing calves, herd size, a higher incidence of bovine respiratory disease (BRD), and access to an outside pen. Metaphylaxis was used on 13.4% of the farms (7.9% only upon arrival, 4.4% only later in the fattening period, 1.1% both upon arrival and later): in 3.2% of the herds in group O, 17.9% of those in group O&P, and 92.9% of those in group P. Application of metaphylaxis upon arrival was positively associated with purchase (OR=8.9) and herd size (OR=1.2 per 10 calves). Metaphylaxis later in the production cycle was positively associated with group size (OR=2.9) and risk of respiratory disease (OR=1.2 per 10% higher risk), and negatively associated with the use of individual antimicrobial treatment (OR=0.3). In many countries, purchase and a large herd size are inherently connected to veal production. The Swiss situation, with large commercial herds but also smaller herds with little or no purchase of calves, made it possible to investigate the effect of these factors on mortality and antimicrobial drug use. The results of this study show that a system in which small farms raise the calves from their own herds has substantial potential to improve animal health and reduce antimicrobial drug use.
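A minimal sketch of the multivariable logistic regression used to derive odds ratios for herd-level risk factors; the synthetic data, variable names, and coefficients below are illustrative assumptions, not the survey's estimates (Python, statsmodels):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic herd-level data mirroring two of the reported predictors.
rng = np.random.default_rng(2)
n = 600
df = pd.DataFrame({
    "purchases_calves": rng.integers(0, 2, n),
    "herd_size_10s": rng.uniform(1, 20, n),   # herd size in units of 10 calves
})
lin = -2.0 + 0.9 * df.purchases_calves + 0.05 * df.herd_size_10s
df["high_mortality"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

fit = smf.logit("high_mortality ~ purchases_calves + herd_size_10s",
                data=df).fit(disp=0)
print(np.exp(fit.params))   # odds ratios, e.g. OR per additional 10 calves
```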
Abstract:
BACKGROUND: This study focused on the descriptive analysis of cattle movements and of farm-level parameters derived from cattle movements, which are considered generically suitable for risk-based surveillance systems in Switzerland for diseases in which animal movements constitute an important risk pathway. METHODS: A framework was developed to select farms for surveillance based on a risk score summarizing five parameters. The proposed framework was validated using data from the bovine viral diarrhoea (BVD) surveillance programme in 2013. RESULTS: A cumulative score was calculated per farm from the following parameters: the maximum monthly ingoing contact chain (in 2012), the average number of animals per incoming movement, the use of mixed alpine pastures, and the number of weeks in 2012 in which a farm had movements registered. The final score for a farm depended on the distribution of the parameters; different cut-offs (the 50th, 90th, 95th and 99th percentiles) were explored, and final scores ranged between 0 and 5. Validation of the scores against results from the 2013 BVD surveillance programme gave promising results when the cut-off for each of the five selected farm-level criteria was set at the 50th percentile. Restricting testing to farms with a score ≥2 would have detected the same number of BVD-positive farms as testing all farms, i.e., the outcome of the 2013 surveillance programme could have been achieved with a smaller survey. CONCLUSIONS: The seasonality and time dependency of the activity of individual farms in the networks require a careful assessment of the time period used to determine farm-level criteria. Nevertheless, the selection of farms for risk-based surveillance can be optimized with the proposed scoring system, which was validated using data from the BVD eradication programme. The proposed method is a promising framework for selecting farms according to the risk of infection based on animal movements.
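A minimal sketch of the cumulative scoring described above: one point per criterion at or above a percentile cut-off, a total score of 0-5 per farm, and selection of farms scoring ≥2. The column names are assumptions, the binary alpine-pasture criterion is simplified to a numeric column, and the fifth parameter (not named in the abstract) is a placeholder:

```python
import numpy as np
import pandas as pd

# Four of the five scored parameters are named in the abstract;
# "fifth_parameter" is a placeholder for the unnamed one.
CRITERIA = ["max_ingoing_contact_chain", "animals_per_movement",
            "mixed_alpine_pasture", "weeks_with_movements", "fifth_parameter"]

def risk_scores(farms, percentile=50):
    """One point per criterion at or above the chosen percentile cut-off,
    so the cumulative score per farm ranges from 0 to 5."""
    cutoffs = farms[CRITERIA].quantile(percentile / 100)
    return (farms[CRITERIA] >= cutoffs).sum(axis=1)

# Toy data: 100 farms with random movement parameters.
rng = np.random.default_rng(3)
farms = pd.DataFrame({c: rng.poisson(5, 100) for c in CRITERIA})
farms["score"] = risk_scores(farms)
selected = farms[farms["score"] >= 2]   # farms tested in the validated scenario
print(len(selected), "of", len(farms), "farms selected")
```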