57 results for Uncertainty with Respect to the Future


Relevance:

100.00%

Publisher:

Abstract:

STUDY DESIGN: A prospective case control study was conducted. OBJECTIVES: The purpose of the current study was to determine the intraoperative radiation hazard to spine surgeons from occupational radiation exposure during percutaneous vertebroplasty and the possible consequences for radiation protection. SUMMARY OF BACKGROUND DATA: The development of minimally invasive surgical techniques has led to an increasing number of fluoroscopically guided procedures being performed percutaneously, such as vertebroplasty, the percutaneous cement augmentation of vertebral bodies. METHODS: Three months of occupational dose data for two spine surgeons were evaluated, measuring the radiation doses to the thyroid gland, the upper extremities, and the eyes during vertebroplasty. RESULTS: The annual risk of developing a fatal thyroid cancer is 0.0025%, a very small to small risk. The annual morbidity (the risk of developing any cancer, including nonfatal ones) is 0.025%, already a small to medium risk. The dose to the eye lens was about 8% of the threshold dose for developing a radiation-induced cataract (150 mSv); the risk is therefore very low but not negligible. The doses measured for the skin are 10% of the annual effective dose limit (500 mSv) recommended by the ICRP (International Commission on Radiological Protection); the annual risk of developing a fatal skin cancer is therefore very low. CONCLUSION: While performing percutaneous vertebroplasty, the surgeon is exposed to a significant amount of radiation. Proper surgical technique and shielding devices to decrease the potentially high morbidity are mandatory. Training in radiation protection should be an integral part of the education of all surgeons using minimally invasive, radiologically guided interventional techniques.
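The dose-to-limit comparisons above are simple ratio calculations; as a minimal sketch (the dose figures below are illustrative assumptions back-computed from the quoted percentages, not the study's raw dosimetry):

```python
# Sketch: expressing a measured annual organ dose as a fraction of a
# regulatory threshold or limit, as done above for the eye lens and skin.

def fraction_of_limit(annual_dose_msv: float, limit_msv: float) -> float:
    """Annual dose expressed as a fraction of a threshold/limit dose."""
    return annual_dose_msv / limit_msv

# Illustrative doses (mSv/year), back-computed from the quoted percentages.
eye_lens_fraction = fraction_of_limit(12.0, 150.0)   # ~8% of the cataract threshold
skin_fraction = fraction_of_limit(50.0, 500.0)       # ~10% of the ICRP skin dose limit
```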


When we actively explore the visual environment, our gaze preferentially selects regions characterized by high contrast and high density of edges, suggesting that the guidance of eye movements during visual exploration is driven to a significant degree by perceptual characteristics of a scene. Converging findings suggest that the selection of the visual target for the upcoming saccade critically depends on a covert shift of spatial attention. However, it is unclear whether attention selects the location of the next fixation uniquely on the basis of global scene structure or additionally on local perceptual information. To investigate the role of spatial attention in scene processing, we examined eye fixation patterns of patients with spatial neglect during unconstrained exploration of natural images and compared these to healthy and brain-injured control participants. We computed luminance, colour, contrast, and edge information contained in image patches surrounding each fixation and evaluated whether they differed from randomly selected image patches. At the global level, neglect patients showed the characteristic ipsilesional shift of the distribution of their fixations. At the local level, patients with neglect and control participants fixated image regions in ipsilesional space that were closely similar with respect to their local feature content. In contrast, when directing their gaze to contralesional (impaired) space neglect patients fixated regions of significantly higher local luminance and lower edge content than controls. These results suggest that intact spatial attention is necessary for the active sampling of local feature content during scene perception.


Considerable progress has been made in the past few years towards quantifying and understanding climate variability during past centuries. At the same time, present-day climate has been studied using state-of-the-art data sets and tools with respect to the physical and chemical mechanisms governing climate variability. Both the understanding of the past and the knowledge of the processes are important for assessing and attributing the anthropogenic effect on present and future climate. The most important time period in this context is the past approximately 100 years, which comprises large natural variations and extremes (such as long droughts) as well as anthropogenic influences, most pronounced in the past few decades. Recent and ongoing research efforts steadily improve the observational record of the 20th century, while atmospheric circulation models are used to underpin the mechanisms behind large climatic variations. Atmospheric chemistry and composition are important for understanding climate variability and change, and considerable progress has been made in this field in the past few years. The evolving integration of these research areas into a more comprehensive analysis of recent climate variability was reflected in the organisation of the workshop “Climate variability and extremes in the past 100 years” in Gwatt near Thun (Switzerland), 24–26 July 2006. The aim of this workshop was to bring together scientists working on data issues with statistical climatologists, modellers, and atmospheric chemists to discuss gaps in our understanding of climate variability during the past approximately 100 years.


This paper presents a comparison of principal component (PC) regression and regularized expectation maximization (RegEM) for reconstructing European summer and winter surface air temperature over the past millennium. The reconstruction is performed within a surrogate climate using the National Center for Atmospheric Research (NCAR) Climate System Model (CSM) 1.4 and the climate model ECHO-G, assuming different white and red noise scenarios to define the distortion of pseudoproxy series. We show how sensitivity tests yield valuable a priori information that provides a basis for improving real-world proxy reconstructions. Our results emphasize the need to carefully test and evaluate reconstruction techniques with respect to the temporal resolution and the spatial scale to which they are applied. Furthermore, we demonstrate that uncertainties inherent in the predictand and predictor data have to be taken into account more rigorously. The comparison of the two statistical techniques, in the specific experimental setting presented here, indicates that more skilful results are achieved with RegEM, as low-frequency variability is better preserved. We further detect seasonal differences in reconstruction skill at the continental scale; e.g., the target temperature average is more adequately reconstructed for summer than for winter. For the specific predictor network given in this paper, both techniques underestimate the target temperature variations to an increasing extent as more noise is added to the signal, albeit less with RegEM than with PC regression. We conclude that climate field reconstruction techniques can be improved and need to be further optimized in future applications.
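The white- and red-noise degradation of pseudoproxies described above can be sketched as follows (the signal-to-noise ratio and AR(1) coefficient are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def make_pseudoproxy(signal, snr=0.5, ar1=0.7, seed=0):
    """Degrade a model temperature series into a pseudoproxy by adding
    AR(1) ('red') noise; ar1=0 reduces to the white-noise case."""
    rng = np.random.default_rng(seed)
    n = len(signal)
    noise = np.empty(n)
    noise[0] = rng.standard_normal()
    for t in range(1, n):
        noise[t] = ar1 * noise[t - 1] + np.sqrt(1 - ar1**2) * rng.standard_normal()
    # Scale the noise so that std(signal) / std(noise) equals the target SNR.
    noise *= signal.std() / (snr * noise.std())
    return signal + noise

temps = np.sin(np.linspace(0, 20, 1000))         # stand-in "model temperature" series
proxy = make_pseudoproxy(temps, snr=0.5, ar1=0.7)
```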


We present a vertically resolved, zonal mean, monthly mean global ozone data set spanning the period 1901 to 2007, called HISTOZ.1.0. It is based on a new approach that combines information from an ensemble of chemistry climate model (CCM) simulations with historical total column ozone information. The CCM simulations incorporate important external drivers of stratospheric chemistry and dynamics (in particular solar and volcanic effects, greenhouse gases and ozone depleting substances, sea surface temperatures, and the quasi-biennial oscillation). The historical total column ozone observations include ground-based measurements from the 1920s onward and satellite observations from 1970 to 1976. An off-line data assimilation approach is used to combine model simulations, observations, and information on the observation error. The period starting in 1979 was used for validation against existing ozone data sets, and therefore only ground-based measurements were assimilated. Results demonstrate considerable skill from the CCM simulations alone. Assimilating observations provides additional skill for total column ozone. With respect to the vertical ozone distribution, assimilating observations increases the correlation with a reference data set on average, but does not decrease the mean squared error. Analyses of HISTOZ.1.0 with respect to the effects of the El Niño–Southern Oscillation (ENSO) and of the 11 yr solar cycle on stratospheric ozone from 1934 to 1979 qualitatively confirm previous studies that focussed on the post-1979 period. The ENSO signature exhibits a much clearer imprint of a change in the strength of the Brewer–Dobson circulation compared to the post-1979 period. The imprint of the 11 yr solar cycle is slightly weaker in the earlier period. Furthermore, the total column ozone increase from the 1950s to around 1970 at northern mid-latitudes is briefly discussed. Indications of contributions from a tropospheric ozone increase, greenhouse gases, and changes in atmospheric circulation are found. Finally, the paper points to several possible future improvements of HISTOZ.1.0.
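The off-line assimilation step that blends the model background with an observation can be illustrated by a minimal scalar Kalman-style update (all numbers are illustrative; the actual HISTOZ scheme works with full ensemble covariances, not scalars):

```python
def assimilate(prior_mean, prior_var, obs, obs_var):
    """One off-line update: weight the model background and the observation
    by the inverse of their respective error variances."""
    gain = prior_var / (prior_var + obs_var)        # Kalman gain
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1 - gain) * prior_var               # analysis error variance
    return post_mean, post_var

# Illustrative total-column values (Dobson units), not HISTOZ numbers.
mean, var = assimilate(prior_mean=300.0, prior_var=100.0, obs=320.0, obs_var=25.0)
```

Because the observation error (25) is smaller than the background error (100), the analysis is pulled most of the way toward the observation while the error variance shrinks.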


BACKGROUND Fractures of the mandible (lower jaw) are a common occurrence and are usually related to interpersonal violence or road traffic accidents. Mandibular fractures may be treated using open (surgical) or closed (non-surgical) techniques. Fracture sites are immobilized with intermaxillary fixation (IMF) or other external or internal devices (i.e. plates and screws) to allow bone healing. Various techniques have been used; however, uncertainty exists with respect to the specific indications for each approach. OBJECTIVES The objective of this review is to provide reliable evidence of the effects of any interventions, either open (surgical) or closed (non-surgical), that can be used in the management of mandibular fractures, excluding the condyles, in adult patients. SEARCH METHODS We searched the following electronic databases: the Cochrane Oral Health Group's Trials Register (to 28 February 2013), the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library 2013, Issue 1), MEDLINE via OVID (1950 to 28 February 2013), EMBASE via OVID (1980 to 28 February 2013), metaRegister of Controlled Trials (to 7 April 2013), ClinicalTrials.gov (to 7 April 2013) and the WHO International Clinical Trials Registry Platform (to 7 April 2013). The reference lists of all trials identified were checked for further studies. There were no restrictions regarding language or date of publication. SELECTION CRITERIA Randomised controlled trials evaluating the management of mandibular fractures without condylar involvement. Any studies that compared different treatment approaches were included. DATA COLLECTION AND ANALYSIS At least two review authors independently assessed trial quality and extracted data. Results were to be expressed using random-effects models, with mean differences for continuous outcomes and risk ratios for dichotomous outcomes, with 95% confidence intervals. Heterogeneity was to be investigated, including both clinical and methodological factors.
MAIN RESULTS Twelve studies, six assessed as at high risk of bias and six as unclear, comprising 689 participants (830 fractures), were included. Interventions examined different plate materials and morphologies; the use of one or two lag screws; microplate versus miniplate; early versus delayed mobilization; eyelet wires versus Rapid IMF™; and the management of angle fractures with intraoral access alone or combined with a transbuccal approach. Patient-oriented outcomes were largely ignored, and post-operative pain scores were inadequately reported. Unfortunately, only one or two trials with small sample sizes were conducted for each comparison and outcome. Our results and conclusions should therefore be interpreted with caution. We were able to pool the results for two comparisons assessing one outcome. Pooled data from two studies comparing two miniplates versus one miniplate revealed no significant difference in the risk of post-operative infection at the surgical site (risk ratio (RR) 1.32, 95% CI 0.41 to 4.22, P = 0.64, I² = 0%). Similarly, no difference in post-operative infection between the use of 3-dimensional (3D) and standard (2D) miniplates was determined (RR 1.26, 95% CI 0.19 to 8.13, P = 0.81, I² = 27%). The included studies involved a small number of participants with a low number of events. AUTHORS' CONCLUSIONS This review illustrates that there is currently inadequate evidence to support the effectiveness of a single approach in the management of mandibular fractures without condylar involvement. The lack of high-quality evidence may be explained by clinical diversity, variability in the assessment tools used, and difficulty in grading outcomes with existing measurement tools. Until high-level evidence is available, treatment decisions should continue to be based on the clinician's prior experience and the individual circumstances.
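Pooling risk ratios across trials, as done for the infection outcome above, is conventionally carried out by inverse-variance weighting on the log scale. A minimal fixed-effect sketch (the event counts are made up for illustration, not the review's data):

```python
import math

def pooled_risk_ratio(studies):
    """Fixed-effect inverse-variance pooling of risk ratios on the log scale.
    Each study is a tuple (events_a, total_a, events_b, total_b)."""
    num = den = 0.0
    for ea, na, eb, nb in studies:
        log_rr = math.log((ea / na) / (eb / nb))
        var = 1 / ea - 1 / na + 1 / eb - 1 / nb   # approximate variance of log RR
        weight = 1 / var
        num += weight * log_rr
        den += weight
    return math.exp(num / den)

# Illustrative counts only (infections / participants per arm in two trials).
rr = pooled_risk_ratio([(4, 50, 3, 50), (6, 60, 5, 62)])
```

A random-effects pooling, as pre-specified in the review's protocol, would add a between-study variance term to each weight.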


Hereditary nasal parakeratosis (HNPK), an inherited monogenic autosomal recessive skin disorder, leads to crusts and fissures on the nasal planum of Labrador Retrievers. We performed a genome-wide association study (GWAS) using 13 HNPK cases and 23 controls. We obtained a single strong association signal on chromosome 2 (p(raw) = 4.4×10⁻¹⁴). The analysis of shared haplotypes among the 13 cases defined a critical interval of 1.6 Mb with 25 predicted genes. We re-sequenced the genome of one case at 38× coverage and detected 3 non-synonymous variants in the critical interval with respect to the reference genome assembly. We genotyped these variants in larger cohorts of dogs and only one was perfectly associated with the HNPK phenotype in a cohort of more than 500 dogs. This candidate causative variant is a missense variant in the SUV39H2 gene encoding a histone 3 lysine 9 (H3K9) methyltransferase, which mediates chromatin silencing. The variant c.972T>G is predicted to change an evolutionary conserved asparagine into a lysine in the catalytically active domain of the enzyme (p.N324K). We further studied the histopathological alterations in the epidermis in vivo. Our data suggest that the HNPK phenotype is not caused by hyperproliferation, but rather delayed terminal differentiation of keratinocytes. Thus, our data provide evidence that SUV39H2 is involved in the epigenetic regulation of keratinocyte differentiation ensuring proper stratification and tight sealing of the mammalian epidermis.


Highland cattle with congenital crop ears have notches of variable size on the tips of both ears. In some cases, cartilage deformation can be seen, and occasionally the external ears are shortened. We collected 40 cases and 80 controls across Switzerland. Pedigree data analysis confirmed a monogenic autosomal dominant mode of inheritance with variable expressivity. All affected animals could be traced back to a single common ancestor. A genome-wide association study was performed and the causative mutation was mapped to a 4 Mb interval on bovine chromosome 6. The H6 family homeobox 1 (HMX1) gene was selected as a positional and functional candidate gene. By whole genome re-sequencing of an affected animal, we detected six non-synonymous coding sequence variants and two variants in an ultra-conserved element at the HMX1 locus with respect to the reference genome. Of these eight variants, only a non-coding 76 bp genomic duplication (g.106720058_106720133dup) located in the conserved region was perfectly associated with crop ears. The identified copy number variation probably results in HMX1 misregulation and a possible gain of function. Our findings confirm the role of HMX1 during the development of the external ear. As it is sometimes difficult to phenotypically diagnose Highland cattle with slight ear notches, genetic testing can now be used to improve selection against this undesired trait.


We appreciate the comments and concerns expressed by Arakawa and colleagues regarding our article, titled “Pulsatile control of rotary blood pumps: Does the modulation waveform matter?”1 Unfortunately, we have to disagree with Arakawa and colleagues. As is obvious from the title of our article, it investigates the effect of different waveforms on the heart–device interaction. In contrast to the authors' claim, this is the first article in the literature that uses basic waveforms (sine, triangle, sawtooth, and rectangular) with different phase shifts to examine their impact on left ventricular unloading. The previous publications2, 3 and 4 merely varied the pump speed during systole and diastole, which was first reported by Bearnson and associates5 in 1996, and studied its effect on aortic pressure, coronary flow, and end-diastolic volume. We should mention that dp/dtmax is a load-sensitive parameter of contractility and not representative of the degree of unloading. Moreover, none of the aforementioned reports has studied mechanical unloading, and in particular the stroke work of the left ventricle. Our method is unique because we do not simply alternate between high and low speed but have accurate control of the waveform, owing to the direct drive system of Levitronix Technologies LLC (Waltham, Mass) and a custom-developed pump controller. Without citing a source, Arakawa and associates state that “several previous studies have already reported the coronary flow diminishes as the left ventricular assist device support increases.” It should be noted that all the waveforms used in our study have a 2000 rpm average value with 1000 rpm amplitude, which is not a speed excessive enough for the CentriMag rotary pump (Levitronix) to collapse the ventricle and diminish the coronary flow. We agree with Arakawa and coworkers that there is a need for a heart failure model to arrive at results more relevant to clinical expectations.
However, we have explored many existing models, including species and breeds with a native proneness to cardiomyopathy, but all of them differ from the genetic presentation in humans. We certainly do not believe that the use of microembolization, in which the coronary circulation is impaired by the injection of microspheres, would form a good model from which to draw conclusions about coronary flow change under different loading conditions. A model would be needed in which either an infarct is created to mimic ischemic heart failure or the coronary circulation remains untouched to simulate, for instance, dilated cardiomyopathy. Furthermore, in the discussion we clearly state that “lack of heart failure is a major limitation of our study.” We also believe that unloading is not the only factor in cardiac functional recovery, and excessive unloading of the left ventricle might lead to cardiac tissue atrophy. Therefore, in our article we mention that control of the level of cardiac unloading by assist devices has been suggested as a mechanical tool to promote recovery, and that more studies are required to find better strategies for the speed modulation of rotary pumps and to achieve an optimal heart load control that enhances myocardial recovery. Finally, there are many publications about pulsing rotary blood pumps, and it was impossible to include them all. We preferred to reference some of the earlier basic works, such as the original research by Bearnson and coworkers5 and another article published by our group,6 which is more relevant.


The field of international relations has long been obsessed with democracy and democratization and their effects on international cooperation. More recently, research has turned its focus to how international organizations enhance democracy. This article contributes to this debate and applies a prominent liberal framework to study the ‘outside-in’ effects of the World Trade Organization. The article offers a critical reading of democratization through IO membership. It provides an assessment of the dominant framework put forward by Keohane et al. (2009). In doing so, it develops a set of empirical strategies to test conjectured causal mechanisms with respect to the WTO, and illustrates their potential application by drawing on selected empirical evidence from trade politics. Finally, it proposes a number of analytical revisions to the liberal framework and outlines avenues for future research.


The drop in temperature following large volcanic eruptions has been identified as an important component of natural climate variability. However, owing to the limited number of large eruptions that occurred during the period of instrumental observations, the precise amplitude of post-volcanic cooling is not well constrained. Here we present new evidence on summer temperature cooling over Europe in the years following volcanic eruptions. We compile and analyze an updated network of tree-ring maximum latewood density chronologies spanning the past nine centuries, and compare cooling signatures in this network with exceptionally long instrumental station records and state-of-the-art general circulation models. Results indicate that post-volcanic June–August cooling is strongest in Northern Europe 2 years after an eruption (−0.52 ± 0.05 °C), whereas in Central Europe the temperature response is smaller and occurs 1 year after an eruption (−0.18 ± 0.07 °C). We validate these estimates by comparison with the shorter instrumental network and evaluate the statistical significance of post-volcanic summer temperature cooling in the context of natural climate variability over the past nine centuries. As we find no significant post-volcanic cooling lasting longer than 2 years, our results question the ability of large eruptions to initiate long-term temperature changes through feedback mechanisms in the climate system. We discuss the implications of these findings with respect to the response seen in general circulation models and emphasize the importance of considering well-documented, annually dated eruptions when assessing the significance of volcanic forcing on continental-scale temperature variations.
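Compositing a temperature series across eruption years, as done to obtain the lagged cooling estimates above, is commonly called superposed epoch analysis. A minimal sketch (the series, event year, and window are toy assumptions):

```python
import numpy as np

def superposed_epoch(series, event_years, years, window=(-2, 5)):
    """Average a temperature anomaly series across event (eruption) years:
    stack windows centered on each event and take the mean at each lag."""
    year_index = {y: i for i, y in enumerate(years)}
    lags = range(window[0], window[1] + 1)
    composite = []
    for lag in lags:
        vals = [series[year_index[y + lag]]
                for y in event_years if (y + lag) in year_index]
        composite.append(np.mean(vals))
    return list(lags), composite

# Toy example: a single eruption in 1600 followed by cooling 2 years later.
years = list(range(1500, 1900))
temps = np.zeros(len(years))
temps[years.index(1602)] = -0.5
lags, comp = superposed_epoch(temps, [1600], years)
```

With many eruptions, the composite averages out unrelated variability, and significance can be assessed against composites built from randomly chosen years.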


The Whole Atmosphere Community Climate Model (WACCM) is utilised to study the daily ozone cycle and the underlying photochemical and dynamical processes. The analysis focuses on the daily ozone cycle in the middle stratosphere at 5 hPa, where satellite-based trend estimates of stratospheric ozone are most biased by diurnal sampling effects and drifting satellite orbits. The simulated ozone cycle shows a minimum after sunrise and a maximum in the late afternoon. Further, a seasonal variation of the daily ozone cycle in the stratosphere was found. Depending on season and latitude, the peak-to-valley difference of the daily ozone cycle varies mostly between 3 and 5% (0.4 ppmv) with respect to the midnight ozone volume mixing ratio. The maximum variation of 15% (0.8 ppmv) is found at the polar circle in summer. The global pattern of the strength of the daily ozone cycle is mainly governed by the solar zenith angle and the sunshine duration. In addition, we find synoptic-scale variations in the strength of the daily ozone cycle. These variations are often anti-correlated with regional temperature anomalies and are due to the temperature dependence of the rate coefficients k2 and k3 of the Chapman cycle reactions. Further, the NOx catalytic cycle counteracts the accumulation of ozone during daytime and leads to an anti-correlation between anomalies in NOx and the strength of the daily ozone cycle. Similarly, ozone recombines with atomic oxygen, which leads to an anti-correlation between anomalies in ozone abundance and the strength of the daily ozone cycle. At higher latitudes, an increase of the westerly (easterly) wind causes a decrease (increase) in the sunshine duration of an air parcel, leading to a weaker (stronger) daily ozone cycle.


In spring 2012 CERN provided two weeks of a short-bunch proton beam dedicated to the neutrino velocity measurement over a distance of 730 km. The OPERA neutrino experiment at the underground Gran Sasso Laboratory used an upgraded setup compared to the 2011 measurements, improving the timing accuracy. An independent timing system based on the Resistive Plate Chambers was exploited, providing a time accuracy of ∼1 ns. Neutrino and anti-neutrino contributions were separated using the information provided by the OPERA magnetic spectrometers. The new analysis profited from the precision geodesy measurements of the neutrino baseline and of the CNGS/LNGS clock synchronization. The neutrino arrival time with respect to the one computed assuming the speed of light in vacuum is found to be δt_ν ≡ TOF_c − TOF_ν = (0.6 ± 0.4 (stat.) ± 3.0 (syst.)) ns and δt_ν̄ ≡ TOF_c − TOF_ν̄ = (1.7 ± 1.4 (stat.) ± 3.1 (syst.)) ns for ν_μ and ν̄_μ, respectively. This corresponds to a limit on the muon neutrino velocity with respect to the speed of light of −1.8 × 10⁻⁶ < (v_ν − c)/c < 2.3 × 10⁻⁶ at 90% C.L. This new measurement confirms with higher accuracy the revised OPERA result.
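The fractional velocity deviation follows from the measured time difference and the baseline: for |δt| much smaller than the time of flight, (v_ν − c)/c ≈ δt · c / L. A sketch with the rounded baseline from the text (730 km; the exact geodetic baseline differs slightly):

```python
# Convert an arrival-time difference into a fractional velocity deviation:
# (v_nu - c)/c ≈ delta_t * c / L, valid when |delta_t| << TOF.
C = 299_792_458.0   # speed of light in vacuum, m/s
L = 730e3           # CNGS baseline, m (rounded value quoted in the text)

def velocity_deviation(delta_t_ns: float) -> float:
    """Fractional deviation of neutrino velocity from c for a time lead delta_t (ns)."""
    return delta_t_ns * 1e-9 * C / L

tof_c_ms = L / C * 1e3          # time of flight at c, in ms (~2.44 ms)
dev = velocity_deviation(0.6)   # central value for neutrinos, per the text
```

The central value of 0.6 ns maps to a deviation of a few times 10⁻⁷, comfortably inside the quoted 90% C.L. interval of about ±2 × 10⁻⁶.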


A search for the electroweak pair production of charged sleptons and weak gauginos decaying into final states with two leptons is performed using 4.7 fb⁻¹ of proton–proton collision data at √s = 7 TeV recorded with the ATLAS experiment at the Large Hadron Collider. No significant excesses are observed with respect to the prediction from Standard Model processes. In the scenario of direct slepton production, if the sleptons decay directly into the lightest neutralino, left-handed slepton masses between 85 and 195 GeV are excluded at 95% confidence level for a 20 GeV neutralino. Chargino masses between 110 and 340 GeV are excluded in the scenario of direct production of wino-like chargino pairs decaying into the lightest neutralino via an intermediate on-shell charged slepton, for a 10 GeV neutralino. The results are also interpreted in the framework of the phenomenological minimal supersymmetric Standard Model.


Measurements of the variation of inclusive jet suppression as a function of the relative azimuthal angle, Δφ, with respect to the elliptic event plane provide insight into the path-length dependence of jet quenching. ATLAS has measured the Δφ dependence of jet yields in 0.14 nb⁻¹ of √s_NN = 2.76 TeV Pb+Pb collisions at the LHC for jet transverse momenta p_T > 45 GeV in different collision centrality bins, using an underlying-event subtraction procedure that accounts for elliptic flow. The variation of the jet yield with Δφ was characterized by the parameter v₂^jet and by the ratio of out-of-plane (Δφ ∼ π/2) to in-plane (Δφ ∼ 0) yields. Nonzero v₂^jet values were measured in all centrality bins for p_T < 160 GeV. The jet yields are observed to vary by as much as 20% between the in-plane and out-of-plane directions.