15 results for salpetrige Säure

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 10.00%

Abstract:

Introduction: The aim of the present work was to verify whether calculating a ratio between clotting times obtained with the sensitive PTT-LA and a less sensitive activated partial thromboplastin time (aPTT) reagent may represent a valuable aPTT-based screening strategy for lupus anticoagulants (LA). Methods: For the pilot study, plasma samples from normal subjects (n = 15) and from patients with LA (n = 10), therapeutic anticoagulation with vitamin K antagonists (VKA) (n = 15) or unfractionated heparin (n = 15), coagulation factor deficiencies (n = 16), and inhibitory antibodies against factor VIII or IX (n = 11) were studied. For the evaluation study, 1553 consecutive plasma samples from non-anticoagulated patients investigated for LA between January 2005 and December 2007 at our institution were studied. The following screening strategies were employed: Pathromtin-SL (aPTT-SL), PTT-LA (aPTT-LA), the ratio aPTT-LA/aPTT-SL (aPTT-ratio), and the Russell's viper venom (RVV) based LA-Check. LA-positive samples were identified by mixing studies and a diluted RVV confirmation test (LA-Check/LA-Sure). Results: In the pilot study, all screening strategies had 100% sensitivity, and the aPTT-ratio reached the highest specificity (82%; 95% CI: 74-90%). In the evaluation study, the following sensitivities for LA screening were observed: aPTT-SL 59.0% (95% CI: 57-61%), aPTT-LA 82.1% (95% CI: 80-84%), aPTT-ratio 92.3% (95% CI: 91-94%), and LA-Check 83.3% (95% CI: 82-85%). Conclusion: Calculating a ratio between the LA-sensitive PTT-LA and the less sensitive Pathromtin-SL improves the performance of the PTT-LA itself and represents a simple and sensitive aPTT-based integrated strategy for LA screening.
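The screening statistic itself is just the quotient of the two clotting times. A minimal sketch of the ratio computation; the positivity cut-off shown is hypothetical, since the abstract does not report the threshold used:

```python
def aptt_ratio(aptt_la_s: float, aptt_sl_s: float) -> float:
    """Ratio of the LA-sensitive PTT-LA clotting time to the less
    sensitive Pathromtin-SL clotting time (both in seconds)."""
    return aptt_la_s / aptt_sl_s

# Illustrative cut-off only; the actual threshold would be derived
# from the normal-subject distribution, which the abstract omits.
CUTOFF = 1.2

if aptt_ratio(52.0, 38.0) > CUTOFF:
    print("screen positive -> confirm with mixing studies / LA-Sure")
```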

Relevance: 10.00%

Abstract:

Nowadays, surgical intervention is an essential part of the treatment of idiopathic gynecomastia. Choosing the right method is crucial and is based on the current clinical and histological findings. Before a specific method is chosen, an interdisciplinary evaluation of the patient is necessary to establish a clear indication for surgical intervention. Liposuction is one method that has become popular in recent years. Its advantages include the possibility of combining it with traditional techniques, such as subcutaneous mastectomy or periareolar mastopexy. The main indication is gynecomastia stage IIa/b, and its use is justifiable due to the reduction in surgical complications and scarring. Furthermore, this technique provides an excellent aesthetic outcome for the patient. A total of 162 patients suffering from gynecomastia stages I-III (according to Simon) were surgically treated between 2000 and 2010, and these cases were retrospectively evaluated. The results showed a decline in the use of a T-shaped incision in combination with subcutaneous mastectomy with periareolar tightening, and a corresponding increase in the use of subcutaneous mastectomy in combination with liposuction. The excised tissue should always be sent for histological examination to exclude the presence of malignant cells.

Relevance: 10.00%

Abstract:

Objective: Description of a cat with ischemic muscle necrosis that suffered cardiopulmonary arrest due to hyperkalemia. The pathogenesis, clinical signs and therapy of ischemic muscle necrosis are discussed, and possible causes, symptoms and treatment of hyperkalemia are described. Material and methods: Case report of a four-year-old male castrated domestic shorthair cat. Results: The cat was successfully resuscitated, and the hyperkalemia was treated with several treatment modalities. Conclusion: Ischemic muscle necrosis can lead to severe life-threatening hyperkalemia, which has to be anticipated, monitored and treated adequately. Aggressive fluid therapy might be responsible for a higher risk of hyperkalemia in predisposed cases. Clinical relevance: Potassium concentrations and acid-base disturbances must be closely monitored in patients with ischemic muscle necrosis.

Relevance: 10.00%

Abstract:

BACKGROUND: We aimed to study the incidence and outcome of severe traumatic brain injury (TBI) in Switzerland and to test the feasibility of a large cohort study with case identification in the first 24 hours and 6-month follow-up. METHODS: From January to June 2005, we consecutively enrolled and followed up all persons with severe TBI (Abbreviated Injury Score of the head region >3 and Glasgow Coma Scale <9) in the catchment areas of 3 Swiss medical centres with neurosurgical facilities. The primary outcome was the Extended Glasgow Outcome Scale (GOSE) after 6 months. Secondary outcomes included survival, the Functional Independence Measure (FIM), and health-related quality of life (SF-12) at defined time points up to 6 months after injury. RESULTS: We recruited 101 participants from a source population of about 2.47 million (i.e., about 33% of the Swiss population). The incidence of severe TBI was 8.2 per 100,000 person-years. The overall case fatality was 70%: 41 of 101 persons (41%) died at the scene of the accident, 23 of 60 hospitalised participants (38%) died within 48 hours, and 31 (53%) within 6 months. Across all hospitalised patients, the median GOSE after 6 months was 1 (range 1-8), and 6 (range 2-8) in 6-month survivors. The median total FIM score was 125 (range 18-126); median SF-12 component measures were 44 (range 25-55) for the physical scale and 52 (range 32-65) for the mental scale. CONCLUSIONS: Severe TBI was associated with high case fatality and considerable morbidity in survivors. We demonstrated the feasibility of a multicentre cohort study in Switzerland with the aim of identifying modifiable determinants of outcome and improving current trauma care.

Relevance: 10.00%

Abstract:

We prove analogs of classical almost sure dimension theorems for Euclidean projection mappings in the first Heisenberg group, equipped with a sub-Riemannian metric.
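For background, the classical Euclidean result of which the paper proves Heisenberg analogs is the Marstrand-Mattila projection theorem; a standard formulation is recalled here for orientation (it is not quoted from the paper):

```latex
\[
  \dim_H \pi_V(A) \;=\; \min\{\dim_H A,\ m\}
  \qquad \text{for } \gamma_{n,m}\text{-almost every } V \in G(n,m),
\]
% where A \subseteq \mathbb{R}^n is Borel, \pi_V is the orthogonal
% projection onto the m-plane V, \dim_H is Hausdorff dimension, and
% \gamma_{n,m} is the natural invariant measure on the Grassmannian.
```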

Relevance: 10.00%

Abstract:

We propose a method that robustly combines color and feature buffers to denoise Monte Carlo renderings. On the one hand, feature buffers, such as per-pixel normals, textures, or depth, are effective for determining denoising filters because features are highly correlated with the rendered image. Filters based solely on features, however, are prone to blurring image details that are not well represented by the features. On the other hand, color buffers represent all details, but they may be less effective for determining filters because they are contaminated by the very noise that is supposed to be removed. We propose to obtain filters using a combination of color and feature buffers in an NL-means and cross-bilateral filtering framework. We determine a robust weighting of colors and features using a SURE-based error estimate. We show significant improvements in subjective and quantitative errors compared to the previous state of the art. We also demonstrate adaptive sampling and space-time filtering for animations.
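The selection idea can be illustrated with per-pixel SURE. A deliberately simplified sketch, assuming the filter's per-pixel derivative with respect to its input is available; the paper's actual estimator and its smoothing of the error maps are omitted:

```python
import numpy as np

def per_pixel_sure(filtered: np.ndarray, noisy: np.ndarray,
                   sigma2: np.ndarray, dfilter_dnoisy: np.ndarray) -> np.ndarray:
    """Stein's unbiased risk estimate, per pixel:
    SURE_i = (F(y)_i - y_i)^2 - sigma_i^2 + 2 * sigma_i^2 * dF_i/dy_i."""
    return (filtered - noisy) ** 2 - sigma2 + 2.0 * sigma2 * dfilter_dnoisy

def pick_by_sure(cand_a: np.ndarray, cand_b: np.ndarray,
                 sure_a: np.ndarray, sure_b: np.ndarray) -> np.ndarray:
    """Per pixel, keep whichever candidate filter output (e.g. a more
    feature-driven vs. a more color-driven filter) has the lower
    estimated error."""
    return np.where(sure_a <= sure_b, cand_a, cand_b)
```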

Relevance: 10.00%

Abstract:

Over the last decade, European democracies have faced a challenge from the rising force of new populist movements. The financial and sovereign debt crisis in Europe created fertile soil for the strengthening of long-established, and the development of new, populist parties in several EU member states. José Manuel Barroso, president of the European Commission, emphasized his increased unease concerning these developments when speaking at the annual Brussels Think Tank Forum on 22 April 2013: “I am deeply concerned about the divisions that we see emerging: political extremes and populism tearing apart the political support and the social fabric that we need to deal with the crisis; […]” (Barroso 2013). Indeed, European elites seem increasingly worried by these recent developments, which are perceived as an impending stress test of the Union and of the project of European integration as a whole (Hartleb 2013). Sure enough, the results of the 2014 European Parliament elections revealed strong support for populist political parties in many EU member countries. To understand the success of populist parties in Europe, it is crucial first to shed light on the nature of populist party communication itself. Significant communicative differences may explain the varying success of populist parties between and within countries, where a pure demand-side approach (i.e. a focus on the preferences of the electorate) often fails to do so (Mudde 2010). The aim of this study is therefore to analyse which types of populist communication styles emerged during the 2014 EP election campaign and under which conditions populist communication styles are selected by political parties. So far, the empirical measurement of populism has received only scant attention (Rooduijn & Pauwels 2011). Moreover, most existing empirical investigations of populism are single case studies (Albertazzi & McDonnell 2008), and scholars have not yet developed systematic methods for measuring populism in a comparative way (Rooduijn & Pauwels 2011). This is a consequence of the lack of conceptual clarity that accompanies populism (Taggart 2000; Barr 2009; Canovan 1999) due to its contextual sensitivity. Hence, populism in Europe should be analysed in a way that clarifies the concept of populism and, moreover, takes into account that the Europeanization of politics influences the type of populist party communication, which is what this study intends to do.

Relevance: 10.00%

Abstract:

Superresolution from plenoptic cameras or camera arrays is usually treated similarly to superresolution from video streams. However, the transformation between the low-resolution views can be determined precisely from camera geometry and parallax. Furthermore, as each low-resolution image originates from a unique physical camera, its sampling properties can also be made unique. We exploit this option with a custom design of either the optics or the sensor pixels. This design makes sure that the sampling matrix of the complete system is always well formed, enabling robust and high-resolution image reconstruction. We show that simply changing the pixel aspect ratio from square to anamorphic is sufficient to achieve that goal, as long as each camera has a unique aspect ratio. We support this claim with theoretical analysis and with reconstructions of real images. We derive the optimal aspect ratios for sets of 2 or 4 cameras. Finally, we verify our solution with a camera system using an anamorphic lens.
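The well-formed-sampling-matrix claim can be previewed with a 1-D toy model, where pixel footprints of different widths stand in for different aspect ratios. A sketch under these simplifying assumptions (box footprints, integer widths, no noise), not the paper's actual construction:

```python
import numpy as np

def box_sampling_matrix(hr_len: int, stride: int, box_width: int) -> np.ndarray:
    """One row per low-resolution pixel: each row averages `box_width`
    consecutive high-resolution samples, rows spaced `stride` apart.
    The box width is a 1-D stand-in for the pixel aspect ratio."""
    n_lr = hr_len // stride
    A = np.zeros((n_lr, hr_len))
    for i in range(n_lr):
        start = min(i * stride, hr_len - box_width)
        A[i, start:start + box_width] = 1.0 / box_width
    return A

hr = 64
same = np.vstack([box_sampling_matrix(hr, 2, 2)] * 2)
mixed = np.vstack([box_sampling_matrix(hr, 2, 2),
                   box_sampling_matrix(hr, 2, 3)])
# Duplicate footprints add no information, while distinct footprints
# raise the rank of the stacked system toward full.
print(np.linalg.matrix_rank(same), np.linalg.matrix_rank(mixed))
```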

Relevance: 10.00%

Abstract:

Neuropsychologists often face interpretational difficulties when assessing cognitive deficits, particularly in cases of unclear cerebral etiology. How can we be sure whether a single test score below the population average is indicative of a pathological brain condition or of normal variability? In the past few years, the topic of intra-individual performance variability has attracted considerable interest. On the basis of a large normative sample, two measures of performance variability and their importance for neuropsychological interpretation are presented in this paper: the number of low scores and the level of dispersion. We conclude that low scores are common in healthy individuals, whereas the level of dispersion is relatively small. Base rate information about abnormally low scores and abnormally high dispersion across cognitive abilities is provided to improve awareness of normal variability and to serve clinicians as an additional interpretive measure in the diagnostic process.
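Both variability measures are simple to compute from a patient's profile of standardized scores. A minimal sketch; the scores, the low-score cut-off, and the use of the range as the dispersion index are illustrative assumptions, not the paper's normative values:

```python
import numpy as np

# Hypothetical subtest results for one patient, as z-scores relative
# to the normative sample (values invented for illustration).
z_scores = np.array([0.3, -1.1, -0.2, -1.6, 0.8, -0.5])

# Number of low scores: count below a conventional cut-off,
# here 1 SD below the normative mean.
n_low = int(np.sum(z_scores < -1.0))

# Dispersion: intra-individual spread across subtests, here the
# range of z-scores (one common operationalization among several).
dispersion = float(z_scores.max() - z_scores.min())

print(f"low scores: {n_low}, dispersion: {dispersion:.1f} SD")
```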

Relevance: 10.00%

Abstract:

OBJECTIVE To determine whether adequacy of randomisation and allocation concealment is associated with changes in effect sizes (ES) when comparing physical therapy (PT) trials with and without these methodological characteristics. DESIGN Meta-epidemiological study. PARTICIPANTS A random sample of randomised controlled trials (RCTs) included in meta-analyses in the PT discipline was identified. INTERVENTION Data extraction, including assessments of random sequence generation and allocation concealment, was conducted independently by two reviewers. To determine the association of random sequence generation and allocation concealment with ES, a two-level analysis was conducted using a meta-meta-analytic approach. PRIMARY AND SECONDARY OUTCOME MEASURES Association between random sequence generation and allocation concealment and ES in PT trials. RESULTS 393 trials included in 43 meta-analyses, analysing 44 622 patients, contributed to this study. Adequate random sequence generation and appropriate allocation concealment were accomplished in only 39.7% and 11.5% of PT trials, respectively. Although trials with inappropriate allocation concealment tended to overestimate treatment effects when compared with trials with adequate concealment of allocation, the difference was not statistically significant (ES=0.12; 95% CI -0.06 to 0.30). When pooling our results with those of Nuesch et al, we obtained a statistically significant pooled value (ES=0.14; 95% CI 0.02 to 0.26). There was no difference in ES between trials with appropriate and inappropriate random sequence generation (ES=0.02; 95% CI -0.12 to 0.15). CONCLUSIONS Our results suggest that, when evaluating the risk of bias of primary RCTs in the PT area, systematic reviewers and clinicians implementing research into practice should pay attention to these biases, since they could exaggerate treatment effects. Systematic reviewers should perform sensitivity analyses that include trials at low risk of bias in these domains as the primary analysis and/or in combination with less restrictive analyses. Authors and editors should make sure that allocation concealment and random sequence generation are properly reported in trial reports.
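The pooling step can be reproduced mechanically with fixed-effect inverse-variance pooling, recovering standard errors from the reported confidence intervals. A sketch; the second input below stands in for the Nuesch et al estimate, whose numbers are not given in the abstract and are invented here purely to show the method:

```python
import math

def se_from_ci95(lo: float, hi: float) -> float:
    """Standard error recovered from a 95% confidence interval."""
    return (hi - lo) / (2 * 1.96)

def pool_fixed(estimates: list[tuple[float, float]]):
    """Fixed-effect inverse-variance pooling of (ES, SE) pairs."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    es = sum(w * e for (e, _), w in zip(estimates, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return es, (es - 1.96 * se, es + 1.96 * se)

# This study's allocation-concealment result...
ours = (0.12, se_from_ci95(-0.06, 0.30))
# ...and a hypothetical stand-in for the Nuesch et al estimate.
other = (0.16, 0.09)
print(pool_fixed([ours, other]))
```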

Relevance: 10.00%

Abstract:

We study pathwise invariances and degeneracies of random fields, with motivating applications in Gaussian process modelling. The key idea is that a number of structural properties one may wish to impose a priori on functions boil down to degeneracy properties under well-chosen linear operators. We first show, in a second-order set-up, that almost sure degeneracy of random field paths under some class of linear operators defined in terms of signed measures can be controlled through the first two moments. A special focus is then put on the Gaussian case, where these results are revisited and extended to further linear operators thanks to state-of-the-art representations. Several degeneracy properties are tackled, including random fields with symmetric paths, centred paths, harmonic paths, or sparse paths. The proposed approach delivers a number of promising results and perspectives in Gaussian process modelling. In a first numerical experiment, it is shown that dedicated kernels can be used to infer an axis of symmetry. Our second numerical experiment deals with conditional simulations of a solution to the heat equation, where adapted kernels notably enable improved predictions of non-linear functionals of the field, such as its maximum.
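One concrete instance of such a dedicated kernel: averaging a base kernel over a reflection yields a Gaussian process whose sample paths are almost surely symmetric about the chosen axis. A minimal sketch with an assumed squared-exponential base kernel; the paper's general construction covers a much broader class of operators:

```python
import numpy as np

def rbf(x: float, y: float, ell: float = 1.0) -> float:
    """Squared-exponential base kernel on the real line."""
    return float(np.exp(-0.5 * (x - y) ** 2 / ell ** 2))

def symmetric_kernel(x: float, y: float, axis: float = 0.0) -> float:
    """Kernel of a GP whose paths are a.s. symmetric about `axis`:
    average the base kernel over the reflection s(t) = 2*axis - t
    applied to each argument."""
    sx, sy = 2 * axis - x, 2 * axis - y
    return 0.25 * (rbf(x, y) + rbf(x, sy) + rbf(sx, y) + rbf(sx, sy))

# Sanity check: the covariance is unchanged when either argument is
# reflected, so sampled paths f satisfy f(x) = f(2*axis - x).
assert np.isclose(symmetric_kernel(0.7, -0.2), symmetric_kernel(-0.7, -0.2))
```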

Relevance: 10.00%

Abstract:

Trade in agriculture is linked to a whole range of economic, environmental, societal and future interests. For this reason, the international regulation of trade in agricultural goods is highly contentious. While mainly directed towards an opening of markets, the WTO Agreement on Agriculture also has some entry points for 'non-trade concerns'. However, the agreement still looks like a casual patchwork that allows rather unsystematically for exemptions, without explicitly exposing the grounds on which they rest. The question arises of how the agreement could be drafted in a more structured way, so as to make sure that the economic objectives are efficiently pursued while human rights and environmental concerns are adequately taken into account. The concept of sustainable development provides a methodical 'seven-step' framework that gives guidance on integrated decision-making processes. In this paper, this framework is partially applied to the Agreement on Agriculture. This working paper served as an introductory note to a brainstorming workshop on the subject that took place on 27 March 2009 at the World Trade Institute, University of Bern.

Relevance: 10.00%

Abstract:

Most managers see strategy development as serious business. It is ironic, then, that some of the most remarkable strategic breakthroughs in organizations emerge not from well-ordered processes but from messy, ambiguous and sometimes irrational activities - pursuits that can best be described as play. Referring to research in the fields of developmental psychology and anthropology, the authors argue that play can stimulate the development of cognitive and interpretive skills and engender an emotional sense of fulfillment. It can help establish a safe environment for introducing new ideas about market opportunities, generating debate about important strategic issues, challenging old assumptions and building a sense of common purpose. The authors draw on their own experience of working with managers at the Imagination Lab Foundation and at Templeton College, Oxford University, and they are careful to point out that play is no substitute for rational, conventional strategy development. Indeed, after the creative sessions are over, plenty of hard work remains to translate the ideas and insights into processes and actions.

Relevance: 10.00%

Abstract:

This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points at which the expensive function has previously been evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers, from which many candidate points are generated by random perturbations. Based on surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are made tabu for a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF-based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
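The center-selection step can be sketched directly from this description: rank evaluated points by non-dominated sorting on the two stated objectives (low function value, high distance to the nearest other evaluated point), skipping tabu centers. A simplified illustration, not the published SOP implementation:

```python
import numpy as np

def select_centers(X: np.ndarray, f_vals: np.ndarray, n_centers: int,
                   tabu: frozenset[int] = frozenset()) -> list[int]:
    """Non-dominated sorting on (function value, -nearest-neighbor
    distance), both minimised; Pareto fronts are peeled off until
    enough centers are collected. `tabu` holds indices of centers
    whose recent candidates brought no improvement."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    obj = np.column_stack([f_vals, -d.min(axis=1)])
    chosen: list[int] = []
    remaining = [i for i in range(len(X)) if i not in tabu]
    while remaining and len(chosen) < n_centers:
        front = [i for i in remaining
                 if not any(np.all(obj[j] <= obj[i]) and np.any(obj[j] < obj[i])
                            for j in remaining if j != i)]
        chosen.extend(front)
        remaining = [i for i in remaining if i not in front]
    return chosen[:n_centers]
```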

Relevance: 10.00%

Abstract:

Specification consortia and standardization bodies concentrate on e-Learning objects to ensure reusability of content. Learning objects may be collected in a library and used for deriving course offerings that are customized to the needs of different learning communities. However, customization of courses is possible only if the logical dependencies between the learning objects are known. Metadata for describing object relationships have been proposed in several e-Learning specifications. This paper discusses the customization potential of e-Learning objects but also the pitfalls that exist if content is customized inappropriately.