Abstract:
Objectives: In fast ball sports like beach volleyball, decision-making skills are a determining factor for excellent performance. The current investigation aimed to identify factors that influence the decision-making process in top-level beach volleyball defense in order to find relevant aspects for further research. For this reason, focused interviews with top players in international beach volleyball were conducted and analyzed with respect to decision-making characteristics. Design: Nineteen world-tour beach volleyball defense players, including seven Olympic or world champions, were interviewed, focusing on decision-making factors, gaze behavior, and interactions between the two. Methods: Verbal data were analyzed by inductive content analysis according to Mayring (2008). This approach allows categories to emerge from the interview material itself instead of forcing data into preset classifications and theoretical concepts. Results: The data analysis showed that, for top-level beach volleyball defense, decision making depends on opponent specifics, external context, situational context, opponent's movements, and intuition. Information on gaze patterns and visual cues revealed general tendencies indicating optimal gaze strategies that support excellent decision making. Furthermore, the analysis highlighted interactions between gaze behavior, visual information, and domain-specific knowledge. Conclusions: The present findings provide information on visual perception, domain-specific knowledge, and interactions between the two that are relevant for decision making in top-level beach volleyball defense. The results can be used to inform sports practice and to further untangle relevant mechanisms underlying decision making in complex game situations.
Abstract:
Human risk taking is characterized by a large amount of individual heterogeneity. In this study, we applied resting-state electroencephalography, which captures stable individual differences in neural activity, before subjects performed a risk-taking task. Using a source-localization technique, we found that the baseline cortical activity in the right prefrontal cortex predicts individual risk-taking behavior. Individuals with higher baseline cortical activity in this brain area display more risk aversion than do other individuals. This finding demonstrates that neural characteristics that are stable over time can predict a highly complex behavior such as risk-taking behavior and furthermore suggests that hypoactivity in the right prefrontal cortex might serve as a dispositional indicator of lower regulatory abilities, which is expressed in greater risk-taking behavior.
Abstract:
OBJECTIVES This study aimed to update the Logistic Clinical SYNTAX score to predict 3-year survival after percutaneous coronary intervention (PCI) and compare the performance with the SYNTAX score alone. BACKGROUND The SYNTAX score is a well-established angiographic tool to predict long-term outcomes after PCI. The Logistic Clinical SYNTAX score, developed by combining clinical variables with the anatomic SYNTAX score, has been shown to perform better than the SYNTAX score alone in predicting 1-year outcomes after PCI. However, the ability of this score to predict long-term survival is unknown. METHODS Patient-level data (N = 6,304, 399 deaths within 3 years) from 7 contemporary PCI trials were analyzed. We revised the overall risk and the predictor effects in the core model (SYNTAX score, age, creatinine clearance, and left ventricular ejection fraction) using Cox regression analysis to predict mortality at 3 years. We also updated the extended model by combining the core model with additional independent predictors of 3-year mortality (i.e., diabetes mellitus, peripheral vascular disease, and body mass index). RESULTS The revised Logistic Clinical SYNTAX models showed better discriminative ability than the anatomic SYNTAX score for the prediction of 3-year mortality after PCI (c-index: SYNTAX score, 0.61; core model, 0.71; and extended model, 0.73 in a cross-validation procedure). The extended model in particular performed better in differentiating low- and intermediate-risk groups. CONCLUSIONS Risk scores combining clinical characteristics with the anatomic SYNTAX score substantially better predict 3-year mortality than the SYNTAX score alone and should be used for long-term risk stratification of patients undergoing PCI.
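For illustration only, a minimal Python sketch of the kind of analysis described above (a Cox model with the core-model predictors and a cross-validated concordance index, using the lifelines package) might look as follows; this is not the authors' code, and the file name and column names are hypothetical.

import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import k_fold_cross_validation

# Hypothetical patient-level data: follow-up time in years, a death indicator,
# and the four core-model predictors named in the abstract.
df = pd.read_csv("pci_patients.csv")  # hypothetical file
core_cols = ["time", "death", "syntax_score", "age", "creatinine_clearance", "lvef"]

# Fit the Cox proportional hazards model for mortality.
cph = CoxPHFitter()
cph.fit(df[core_cols], duration_col="time", event_col="death")
cph.print_summary()  # hazard ratios per predictor

# Cross-validated concordance index (discrimination), analogous to the
# c-index values reported for the core and extended models.
scores = k_fold_cross_validation(
    CoxPHFitter(), df[core_cols],
    duration_col="time", event_col="death",
    k=10, scoring_method="concordance_index",
)
print("mean cross-validated c-index:", sum(scores) / len(scores))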
Abstract:
OBJECTIVE To investigate the long-term prognostic implications of coronary calcification in patients undergoing percutaneous coronary intervention for obstructive coronary artery disease. METHODS Patient-level data from 6296 patients enrolled in seven clinical drug-eluting stent trials were analysed by an independent academic research organisation (Cardialysis, Rotterdam, The Netherlands) to identify the presence of severe coronary calcification in the angiographic images. Clinical outcomes at 3-year follow-up, including all-cause mortality, death-myocardial infarction (MI), and the composite end-point of all-cause death-MI-any revascularisation, were compared between patients with and without severe calcification. RESULTS Severe calcification was detected in 20% of the studied population. Patients with severe lesion calcification were less likely to have undergone complete revascularisation (48% vs 55.6%, p<0.001) and had increased mortality compared with those without severely calcified arteries (10.8% vs 4.4%, p<0.001). The event rate was also higher in patients with severely calcified lesions for the combined end-points death-MI (22.9% vs 10.9%; p<0.001) and death-MI-any revascularisation (31.8% vs 22.4%; p<0.001). On multivariate Cox regression analysis including the SYNTAX score, the presence of severe coronary calcification was an independent predictor of poor prognosis (HR: 1.33, 95% CI 1.00 to 1.77, p=0.047 for death; 1.23, 95% CI 1.02 to 1.49, p=0.031 for death-MI; and 1.18, 95% CI 1.01 to 1.39, p=0.042 for death-MI-any revascularisation), but it was not associated with an increased risk of stent thrombosis. CONCLUSIONS Patients with severely calcified lesions have worse clinical outcomes than those without severe coronary calcification. Severe coronary calcification is an independent predictor of worse prognosis and should be considered a marker of advanced atherosclerosis.
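As a hedged sketch (not the trial analysis itself), the snippet below shows how an unadjusted comparison and a SYNTAX-adjusted hazard ratio for severe calcification could be obtained with lifelines; the data set and column names are assumptions.

import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("des_trials_pooled.csv")  # hypothetical pooled patient-level data
calc = df[df["severe_calcification"] == 1]
no_calc = df[df["severe_calcification"] == 0]

# Unadjusted comparison of all-cause mortality over follow-up.
result = logrank_test(calc["time"], no_calc["time"],
                      event_observed_A=calc["death"],
                      event_observed_B=no_calc["death"])
print("log-rank p-value:", result.p_value)

# Adjusted hazard ratio for severe calcification, controlling for the SYNTAX score.
cph = CoxPHFitter()
cph.fit(df[["time", "death", "severe_calcification", "syntax_score"]],
        duration_col="time", event_col="death")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])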
Abstract:
Research question: International and national sport federations as well as their member organisations (usually sport clubs) are key actors within the sport system and have a wide range of relationships outside the sport system (e.g. with the state, sponsors, and the media). They are currently facing major challenges such as growing competition in top-level sports, democratisation of sports with “sports for all” and sports as the answer to social problems (integration, education, health, unemployment, etc.). In this context, professionalising sport organisations seems to be an appropriate strategy to face these challenges and solve current problems. We define the professionalisation of sport organisations as an organisational process of transformation leading towards organisational rationalisation, efficiency and business-like management. This has led to a profound organisational change, particularly within sport federations, characterised by the strengthening of institutional management (managerialism) and the implementation of efficiency-based management instruments and paid staff. Research methods: The goal of this article is to review the international literature and establish a global understanding of and theoretical framework for how sport organisations professionalise and what consequences this may have. Results and Findings: Our multi-level approach based on the social theory of action integrates the current concepts for analysing professionalisation in sport federations. We specify the framework for the following research perspectives: (1) forms, (2) causes and (3) consequences, and discuss the reciprocal relations between sport federations and their member organisations in this context. Implications: Finally, we derive general methodological consequences for the investigation of professionalisation processes in sport organisations.
Abstract:
Purpose. This retrospective cohort study evaluated factors for peri-implant bone level changes (ΔIBL) associated with an implant type with inner-cone implant-abutment connection, rough neck surface, and platform switching (AT). Materials and Methods. All AT placed at the Department of Prosthodontics of the University of Bern between January 2004 and December 2005 were included in this study. All implants were examined by single radiographs using the parallel technique taken at surgery (T0) and obtained at least 6 months after surgery (T1). Possible influencing factors were analysed first using the t-test (normal distribution) or the nonparametric Wilcoxon test (not normal distribution), and then a mixed model variance analysis was performed. Results. 43 patients were treated with 109 implants. Five implants in 2 patients failed (survival rate: 95.4%). Mean ΔIBL was −0.65 ± 0.82 mm in group 1 (T1: 6–12 months after surgery) and −0.69 ± 0.82 mm in group 2 (T1: >12 months after surgery).
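As a hedged illustration of the test-selection step described above (not the study's code), the sketch below chooses a t-test when both groups pass a normality check and a rank-based Wilcoxon test otherwise; the simulated ΔIBL values, group sizes, and the Shapiro-Wilk pre-test are assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated peri-implant bone level changes (mm) for the two follow-up groups.
delta_ibl_group1 = rng.normal(-0.65, 0.82, size=60)
delta_ibl_group2 = rng.normal(-0.69, 0.82, size=49)

# Check normality of each group before choosing the comparison test.
normal = all(stats.shapiro(g).pvalue > 0.05
             for g in (delta_ibl_group1, delta_ibl_group2))

if normal:
    stat, p = stats.ttest_ind(delta_ibl_group1, delta_ibl_group2)
    print(f"t-test: t={stat:.2f}, p={p:.3f}")
else:
    stat, p = stats.ranksums(delta_ibl_group1, delta_ibl_group2)
    print(f"Wilcoxon rank-sum test: z={stat:.2f}, p={p:.3f}")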
Abstract:
The association between helmet use during alpine skiing and the incidence and severity of head injuries was analyzed. All patients admitted to a level 1 trauma center for traumatic brain injuries (TBIs) sustained in skiing accidents during the seasons 2000-2001 and 2010-2011 were eligible. The primary outcome was the association between helmet use and severity of TBI measured by the Glasgow Coma Scale (GCS), computed tomography (CT) results, and the necessity of neurosurgical intervention. Of 1362 patients injured during alpine skiing, 245 (18%) sustained TBI and were included. TBI was fatal in 3%. Head injury was minor (Glasgow Coma Scale 13-15) in 76%, moderate in 6%, and severe in 14%. The number and percentage of TBI patients showed no significant trend over the investigated seasons. Forty-five percent of the 245 patients had pathological CT findings, and 26% of these required neurosurgical intervention. Helmet use increased from 0% in 2000-2001 to 71% in 2010-2011 (p<0.001). The main analysis, comparing TBI in patients with or without a helmet, showed an adjusted odds ratio (OR) of 1.44 (p=0.430) for suffering moderate-to-severe head injury in helmet users. Analyses comparing off-piste to on-slope skiers revealed a significantly increased OR of 7.62 (p=0.004) among off-piste skiers for sustaining a TBI requiring surgical intervention. Despite increases in helmet use, we found no decrease in severe TBI among alpine skiers. Logistic regression analysis showed no significant difference in TBI with regard to helmet use, but an increased risk for off-piste skiers. The limited protection of helmets and the dangers of skiing off-piste should be targeted by prevention programs.
Abstract:
Pentatricopeptide repeat domain protein 1 (PTCD1) is a novel human protein that was recently shown to decrease the levels of mitochondrial leucine tRNAs. The physiological role of this regulation, however, remains unclear. Here we show that amino acid starvation by leucine deprivation significantly increased the steady-state mRNA levels of PTCD1 in human hepatocarcinoma (HepG2) cells. Amino acid starvation also increased the mitochondrially encoded leucine tRNA (tRNA(Leu(CUN))) and the mRNA for the mitochondrial leucyl-tRNA synthetase (LARS2). Despite increased PTCD1 mRNA steady-state levels, amino acid starvation decreased PTCD1 at the protein level. Decreasing the PTCD1 protein concentration increases the stability of the mitochondrial leucine tRNAs, tRNA(Leu(CUN)) and tRNA(Leu(UUR)), as shown by RNAi experiments against PTCD1. Therefore, it is likely that decreased PTCD1 protein contributes to the increased tRNA(Leu(CUN)) levels in amino acid-starved cells. The stabilisation of the mitochondrial leucine tRNAs and the upregulation of the mitochondrial leucyl-tRNA synthetase LARS2 might play a role in the adaptation of mitochondria to amino acid starvation.
Abstract:
QUESTIONS UNDER STUDY: Patient characteristics and risk factors for death of Swiss trauma patients in the Trauma Audit and Research Network (TARN). METHODS: Descriptive analysis of trauma patients (≥16 years) admitted to a level I trauma centre in Switzerland (September 1, 2009 to August 31, 2010) and entered into TARN. Multivariable logistic regression analysis was used to identify predictors of 30-day mortality. RESULTS: Of 458 patients 71% were male. The median age was 50.5 years (inter-quartile range [IQR] 32.2-67.7), median Injury Severity Score (ISS) was 14 (IQR 9-20) and median Glasgow Coma Score (GCS) was 15 (IQR 14-15). The ISS was >15 for 47%, and 14% had an ISS >25. A total of 17 patients (3.7%) died within 30 days of trauma. All deaths were in patients with ISS >15. Most injuries were due to falls <2 m (35%) or road traffic accidents (29%). Injuries to the head (39%) were followed by injuries to the lower limbs (33%), spine (28%) and chest (27%). The time of admission peaked between 12:00 and 22:00, with a second peak between 00:00 and 02:00. A total of 64% of patients were admitted directly to our trauma centre. The median time to CT was 30 min (IQR 18-54 min). Using multivariable regression analysis, the predictors of mortality were older age, higher ISS and lower GCS. CONCLUSIONS: Characteristics of Swiss trauma patients derived from TARN were described for the first time, providing a detailed overview of the institutional trauma population. Based on these results, patient management and hospital resources (e.g. triage of patients, time to CT, staffing during night shifts) could be evaluated as a further step.
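For illustration only (not the TARN analysis itself), a minimal sketch of a multivariable logistic regression of 30-day mortality on age, ISS and GCS, the predictors reported above, could look like this in statsmodels; the data file and column names are hypothetical.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trauma_registry.csv")  # hypothetical file, one row per patient

# death_30d: 1 if the patient died within 30 days of trauma, else 0.
model = smf.logit("death_30d ~ age + iss + gcs", data=df).fit()
print(model.summary())

# Odds ratios with 95% confidence intervals for each predictor.
odds_ratios = pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1)
odds_ratios.columns = ["OR", "2.5%", "97.5%"]
print(odds_ratios)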
Abstract:
Policy actors tend to misinterpret and distrust opponents in policy processes. This phenomenon, known as the “devil shift”, consists of the following two dimensions: actors perceive opponents as more powerful and as more evil than they really are. Analysing nine policy processes in Switzerland, this article highlights the drivers of the devil shift at two levels. On the actor level, interest groups, political parties and powerful actors suffer more from the devil shift than state actors and powerless actors. On the process level, the devil shift is stronger in policy processes dealing with socio-economic issues as compared with other issues. Finally, and in line with previous studies, there is less empirical evidence of the power dimension of the devil shift phenomenon than of its evilness dimension.
Abstract:
OBJECTIVES In Europe and elsewhere, health inequalities among HIV-positive individuals are of concern. We investigated late HIV diagnosis and late initiation of combination antiretroviral therapy (cART) by educational level, a proxy of socioeconomic position. DESIGN AND METHODS We used data from nine HIV cohorts within COHERE in Austria, France, Greece, Italy, Spain and Switzerland, collecting data on level of education in the categories of the UNESCO/International Standard Classification of Education: non-completed basic, basic, secondary and tertiary education. We included individuals diagnosed with HIV between 1996 and 2011, aged at least 16 years, with known educational level and at least one CD4 cell count within 6 months of HIV diagnosis. We examined trends by education level in presentation with advanced HIV disease (AHD) (CD4 <200 cells/μl or AIDS within 6 months) using logistic regression, and the distribution of CD4 cell count at cART initiation, overall and among presenters without AHD, using median regression. RESULTS Among 15 414 individuals, 52, 45, 37, and 31% of those with non-completed basic, basic, secondary and tertiary education, respectively, presented with AHD (P trend <0.001). Compared with patients with tertiary education, the adjusted odds ratios of AHD were 1.72 (95% confidence interval 1.48-2.00) for non-completed basic, 1.39 (1.24-1.56) for basic and 1.20 (1.08-1.34) for secondary education (P < 0.001). In unadjusted and adjusted analyses, median CD4 cell count at cART initiation was lower with poorer educational level. CONCLUSIONS Socioeconomic inequalities in delayed HIV diagnosis and initiation of cART are present in European countries with universal healthcare systems, and individuals with a lower educational level do not benefit equally from timely cART initiation.
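As a hedged sketch (not the COHERE analysis), the code below illustrates the two models named above: a logistic regression for presentation with AHD by education level and a median (quantile) regression for CD4 cell count at cART initiation, with tertiary education as the reference category; the data frame and column names are assumptions.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohere_subset.csv")  # hypothetical file
# 'education' holds one of: 'non-completed basic', 'basic', 'secondary', 'tertiary'.

# Odds of presenting with advanced HIV disease by education level.
ahd_model = smf.logit(
    "ahd ~ C(education, Treatment(reference='tertiary'))", data=df
).fit()
print(ahd_model.summary())

# Median regression: CD4 cell count at cART initiation by education level.
cd4_model = smf.quantreg(
    "cd4_at_cart ~ C(education, Treatment(reference='tertiary'))", data=df
).fit(q=0.5)
print(cd4_model.summary())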