39 results for Ridley, Matt: Jalouden alkuperä
Abstract:
OBJECTIVE: Marfan syndrome is a systemic connective tissue disorder caused by mutations in the fibrillin-1 gene. It was originally believed that Marfan syndrome results exclusively from the production of abnormal fibrillin-1 that leads to structurally weaker connective tissue when incorporated into the extracellular matrix. This effect seemed to explain many of the clinical features of Marfan syndrome, including aortic root dilatation and acute aortic dissection, which represent the main causes of morbidity and mortality in Marfan syndrome. METHODS: Recent molecular studies, most based on genetically defined mouse models of Marfan syndrome, have challenged this paradigm. These studies established the critical contribution of fibrillin-1 haploinsufficiency and dysregulated transforming growth factor-beta signaling to disease progression. RESULTS: It seems that many manifestations of Marfan syndrome are less related to a primary structural deficiency of the tissues than to altered morphogenetic and homeostatic programs that are induced by altered transforming growth factor-beta signaling. Most important, transforming growth factor-beta antagonism, through transforming growth factor-beta neutralizing antibodies or losartan (an angiotensin II type 1 receptor antagonist), has been shown to prevent and possibly reverse aortic root dilatation, mitral valve prolapse, lung disease, and skeletal muscle dysfunction in a mouse model of Marfan syndrome. CONCLUSION: There are indicators that losartan, a drug widely used to treat arterial hypertension in humans, offers the first potential for primary prevention of clinical manifestations in Marfan syndrome.
Abstract:
AIMS: Currently available devices for transcatheter closure of patent foramen ovale (PFO), all of which rely on a permanent implant, have limitations, including late complications. The study objective was to evaluate the safety, feasibility, and effectiveness of the PFx Closure System, the first transcatheter technique for PFO closure without an implantable device. METHODS AND RESULTS: A prospective study of 144 patients was conducted at nine clinical sites from October 2005 through August 2007. All patients had a history of cryptogenic stroke, transient ischemic attack, migraines, or decompression illness. The mean balloon-stretched diameter of the PFO was 7.9 ± 2.5 mm. Technical success (successful application of radiofrequency energy) was achieved in 130 patients. One patient required a transfusion as a result of blood loss during the procedure. There were no other major procedural complications. There were no recurrent strokes, deaths, conduction abnormalities, or perforations following the procedure. At a mean follow-up of 6 months, successful closure was achieved in 79 patients (55%). In PFOs with balloon-sized or stretched diameters of less than 8 mm, the closure rate was 72% (53/74). CONCLUSION: This study demonstrates that transcatheter closure of a PFO without a permanent implant is technically feasible and safe. Further technique and device modifications are required to achieve higher closure rates.
Abstract:
BACKGROUND: Marfan syndrome (MFS) is caused by mutations in the fibrillin-1 gene and dysregulation of transforming growth factor-beta (TGF-beta). Recent evidence suggests that losartan, an angiotensin II type 1 receptor blocker that blunts TGF-beta activation, may be an effective treatment for MFS. We hypothesized that dysregulation of TGF-beta might be mirrored in circulating TGF-beta concentrations. METHODS AND RESULTS: Serum obtained from MFS mutant mice (Fbn1(C1039G/+)) treated with losartan was analyzed for circulating TGF-beta1 concentrations and compared with those from placebo-treated and wild-type mice. Aortic root size was measured by echocardiography. Data were validated in patients with MFS and healthy individuals. In mice, circulating total TGF-beta1 concentrations increased with age and were elevated in older untreated Fbn1(C1039G/+) mice compared with wild-type mice (P=0.01; n=16; mean±SEM, 115±8 ng/mL versus n=17; mean±SEM, 92±4 ng/mL). Losartan-treated Fbn1(C1039G/+) mice had lower total TGF-beta1 concentrations compared with age-matched Fbn1(C1039G/+) mice treated with placebo (P=0.01; n=18; 90±5 ng/mL), and circulating total TGF-beta1 levels were indistinguishable from those of age-matched wild-type mice (P=0.8). Correlation was observed between circulating TGF-beta1 levels and aortic root diameters in Fbn1(C1039G/+) and wild-type mice (P=0.002). In humans, circulating total TGF-beta1 concentrations were elevated in patients with MFS compared with control individuals (P<0.0001; n=53; 15±1.7 ng/mL versus n=74; 2.5±0.4 ng/mL). MFS patients treated with losartan (n=55) or beta-blocker (n=80) showed significantly lower total TGF-beta1 concentrations compared with untreated MFS patients (P≤0.05). CONCLUSIONS: Circulating TGF-beta1 concentrations are elevated in MFS and decrease after administration of losartan, beta-blocker therapy, or both, and therefore might serve as a prognostic and therapeutic marker in MFS.
Abstract:
BACKGROUND: Single-center reports have identified retrograde ascending aortic dissection (rAAD) as a potentially lethal complication of thoracic endovascular aortic repair (TEVAR). METHODS AND RESULTS: Between 1995 and 2008, 28 centers participating in the European Registry on Endovascular Aortic Repair Complications reported a total of 63 rAAD cases (incidence, 1.33%; 95% CI, 0.75 to 2.40). Eighty-one percent of patients underwent TEVAR for acute (n=26, 54%) or chronic type B dissection (n=13, 27%). Stent grafts with proximal bare springs were used in the majority of patients (83%). Only 7 (15%) patients had intraoperative rAAD; the remainder occurred during the index hospitalization (n=10, 21%) or during follow-up (n=31, 64%). Presenting symptoms included acute chest pain (n=16, 33%), syncope (n=12, 25%), and sudden death (n=9, 19%), whereas one fourth of patients were asymptomatic (n=12, 25%). Most patients underwent emergency (n=25) or elective (n=5) surgical repair. Outcome was fatal in 20 of 48 patients (42%). Causes of rAAD included the stent graft itself (60%), manipulation of guide wires/sheaths (15%), and progression of the underlying aortic disease (15%). CONCLUSIONS: The incidence of rAAD was low (1.33%) in the present analysis, with high mortality (42%). Patients undergoing TEVAR for type B dissection appeared to be the most prone to rAAD. This complication occurred not only during the index hospitalization but also after discharge, up to 1050 days after TEVAR. Importantly, the majority of rAAD cases were associated with the use of proximal bare spring stent grafts, with direct evidence of stent graft-induced injury at surgery or necropsy in half of the patients.
Abstract:
The theater stages a play-world that can never be entirely separated from social reality and, depending on the approach, is not meant to be. The historical and contemporary forms of theater and their interactions with society are manifold, ranging from the staging of community and the construction of national identity in festival plays in times of crisis to the experimental arrangements with which Lukas Bärfuss lays bare present-day social phenomena in the context of postdramatic theater. Whereas Friedrich Dürrenmatt and Max Frisch each display a fundamentally divergent relationship to the possibility of development toward the better, Rolf Hochhuth counts on the enlightening effect of the documentary. The theater texts of Thomas Hürlimann mirror a Swiss society that constitutes itself as a theater audience of the catastrophes surrounding it. The tension between the literary text and the practice of director's theater (Regietheater), characteristic of the multimediality of theater, is exemplified in the experiences of the author Maja Beutler. With texts by: Ursula Amrein, Lukas Bärfuss, Peter von Matt, Franziska Kolp, Elio Pellin, Rudolf Probst, Ursula Ruch, Peter Utz and Ulrich Weber
Abstract:
The accuracy of Global Positioning System (GPS) time series is degraded by the presence of offsets. To assess the effectiveness of methods that detect and remove these offsets, we designed and managed the Detection of Offsets in GPS Experiment. We simulated time series that mimicked realistic GPS data, consisting of a velocity component, offsets, and white and flicker noise (1/f spectrum noise) combined in an additive model. The data set was made available to the GPS analysis community without revealing the offsets, and several groups conducted blind tests with a range of detection approaches. The results show that, at present, manual methods (where offsets are hand-picked) almost always give better results than automated or semi-automated methods (two automated methods give velocity biases quite similar to the best manual solutions). For instance, the 5th-to-95th percentile range in velocity bias for automated approaches is 4.2 mm/yr (most commonly ±0.4 mm/yr from the truth), whereas it is 1.8 mm/yr for the manual solutions (most commonly 0.2 mm/yr from the truth). The magnitude of offsets detectable by manual solutions is smaller than for automated solutions, with the smallest detectable offset equal to 5 mm for the best manual solutions and 8 mm for the best automatic solutions. Assuming the simulated noise levels are representative of real GPS time series, geophysical interpretation of individual site velocities lower than 0.2–0.4 mm/yr is therefore not robust, and a limit nearer 1 mm/yr would be a more conservative choice. Further work to improve offset detection in GPS coordinate time series is required before we can routinely interpret sub-mm/yr velocities for single GPS stations.
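The simulated series described above follow a simple additive model: a linear velocity term, step offsets at hidden epochs, and white plus flicker noise. A minimal sketch of that model, and of why an unmodeled offset biases the estimated velocity, might look as follows; the amplitudes, the offset epoch, and the FFT-based flicker approximation are illustrative assumptions, not the experiment's actual generator:

```python
import numpy as np

rng = np.random.default_rng(0)

def flicker_noise(n, amp=1.0):
    """Approximate flicker (1/f power spectrum) noise by shaping white
    noise in the frequency domain: amplitude scales as f^(-1/2)."""
    freqs = np.fft.rfftfreq(n, d=1.0)
    spectrum = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
    spectrum[1:] /= np.sqrt(freqs[1:])  # 1/f power => 1/sqrt(f) amplitude
    spectrum[0] = 0.0                   # no DC component
    noise = np.fft.irfft(spectrum, n)
    return amp * noise / noise.std()    # rescale to the requested sigma (mm)

n_days = 3650                           # 10 years of daily positions
t = np.arange(n_days) / 365.25          # time in years
true_velocity = 3.0                     # mm/yr (assumed)
offset_epoch, offset_size = 1500, 6.0   # a hypothetical 6 mm step mid-series

series = (true_velocity * t
          + np.where(np.arange(n_days) >= offset_epoch, offset_size, 0.0)
          + rng.normal(scale=1.0, size=n_days)   # 1 mm white noise
          + flicker_noise(n_days, amp=2.0))      # 2 mm flicker noise

# A naive linear fit that ignores the step absorbs it into the slope:
v_naive = np.polyfit(t, series, 1)[0]

# Fitting the step jointly with the trend removes most of that bias:
step = (np.arange(n_days) >= offset_epoch).astype(float)
A = np.column_stack([t, step, np.ones(n_days)])
v_modeled = np.linalg.lstsq(A, series, rcond=None)[0][0]
```

With an upward step left unmodeled, `v_naive` overestimates the trend by a fraction of a mm/yr, the same order as the velocity biases reported above, which is why undetected offsets matter for sub-mm/yr interpretation.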
Abstract:
Methane is an important greenhouse gas, responsible for about 20% of the warming induced by long-lived greenhouse gases since pre-industrial times. By reacting with hydroxyl radicals, methane reduces the oxidizing capacity of the atmosphere and generates ozone in the troposphere. Although most sources and sinks of methane have been identified, their relative contributions to atmospheric methane levels are highly uncertain. As such, the factors responsible for the observed stabilization of atmospheric methane levels in the early 2000s, and the renewed rise after 2006, remain unclear. Here, we construct decadal budgets for methane sources and sinks between 1980 and 2010, using a combination of atmospheric measurements and results from chemical transport models, ecosystem models, climate chemistry models and inventories of anthropogenic emissions. The resulting budgets suggest that data-driven approaches and ecosystem models overestimate total natural emissions. We build three contrasting emission scenarios, which differ in fossil fuel and microbial emissions, to explain the decadal variability in atmospheric methane levels detected, here and in previous studies, since 1985. Although uncertainties in emission trends do not allow definitive conclusions to be drawn, we show that the observed stabilization of methane levels between 1999 and 2006 can potentially be explained by decreasing-to-stable fossil fuel emissions combined with stable-to-increasing microbial emissions. We show that a rise in natural wetland emissions and fossil fuel emissions probably accounts for the renewed increase in global methane levels after 2006, although the relative contribution of these two sources remains uncertain.
Abstract:
The past 1500 years provide a valuable opportunity to study the response of the climate system to external forcings. However, the integration of paleoclimate proxies with climate modeling is critical to improving the understanding of climate dynamics. In this paper, a climate system model and proxy records are therefore used to study the role of natural and anthropogenic forcings in driving the global climate. The inverse and forward approaches to paleoclimate data–model comparison are applied, and sources of uncertainty are identified and discussed. In the first of two case studies, the climate model simulations are compared with multiproxy temperature reconstructions. Robust solar and volcanic signals are detected in Southern Hemisphere temperatures, with a possible volcanic signal detected in the Northern Hemisphere. The anthropogenic signal dominates during the industrial period. It is also found that seasonal and geographical biases may cause multiproxy reconstructions to overestimate the magnitude of the long-term preindustrial cooling trend. In the second case study, the model simulations are compared with a coral δ18O record from the central Pacific Ocean. It is found that greenhouse gases, solar irradiance, and volcanic eruptions all influence the mean state of the central Pacific, but there is no evidence that natural or anthropogenic forcings have any systematic impact on El Niño–Southern Oscillation. The proxy–climate relationship is found to change over time, challenging the assumption of stationarity that underlies the interpretation of paleoclimate proxies. These case studies demonstrate the value of paleoclimate data–model comparison but also highlight the limitations of current techniques and demonstrate the need to develop alternative approaches.
Abstract:
Ongoing changes in disturbance regimes are predicted to cause acute changes in ecosystem structure and function in the coming decades, but many aspects of these predictions are uncertain. A key challenge is to improve the predictability of postdisturbance biogeochemical trajectories at the ecosystem level. Ecosystem ecologists and paleoecologists have generated complementary data sets about disturbance (type, severity, frequency) and ecosystem response (net primary productivity, nutrient cycling) spanning decadal to millennial timescales. Here, we take the first steps toward a full integration of these data sets by reviewing how disturbances are reconstructed using dendrochronological and sedimentary archives and by summarizing the conceptual frameworks for carbon, nitrogen, and hydrologic responses to disturbances. Key research priorities include further development of paleoecological techniques that reconstruct both disturbances and terrestrial ecosystem dynamics. In addition, mechanistic detail from disturbance experiments, long-term observations, and chronosequences can help increase the understanding of ecosystem resilience.
Climate refugia: joint inference from fossil records, species distribution models and phylogeography
Abstract:
Climate refugia, locations where taxa survive periods of regionally adverse climate, are thought to be critical for maintaining biodiversity through the glacial–interglacial climate changes of the Quaternary. A critical research need is to better integrate and reconcile the three major lines of evidence used to infer the existence of past refugia – fossil records, species distribution models and phylogeographic surveys – in order to characterize the complex spatiotemporal trajectories of species and populations in and out of refugia. Here we review the complementary strengths, limitations and new advances for these three approaches. We provide case studies to illustrate their combined application, and point the way towards new opportunities for synthesizing these disparate lines of evidence. Case studies with European beech, Qinghai spruce and Douglas-fir illustrate how the combination of these three approaches successfully resolves complex species histories not attainable from any one approach. Promising new statistical techniques can capitalize on the strengths of each method and provide a robust quantitative reconstruction of species history. Studying past refugia can help identify contemporary refugia and clarify their conservation significance, in particular by elucidating the fine-scale processes and the particular geographic locations that buffer species against rapidly changing climate.
Abstract:
This paper is the maritime and sub-Antarctic contribution to the Scientific Committee for Antarctic Research (SCAR) Past Antarctic Ice Sheet Dynamics (PAIS) community Antarctic Ice Sheet reconstruction. The overarching aim for all sectors of Antarctica was to reconstruct the Last Glacial Maximum (LGM) ice sheet extent and thickness, and map the subsequent deglaciation in a series of 5000 year time slices. However, our review of the literature found surprisingly few high quality chronological constraints on changing glacier extents on these timescales in the maritime and sub-Antarctic sector. Therefore, in this paper we focus on an assessment of the terrestrial and offshore evidence for the LGM ice extent, establishing minimum ages for the onset of deglaciation, and separating evidence of deglaciation from LGM limits from those associated with later Holocene glacier fluctuations. Evidence included geomorphological descriptions of glacial landscapes, radiocarbon dated basal peat and lake sediment deposits, cosmogenic isotope ages of glacial features and molecular biological data. We propose a classification of the glacial history of the maritime and sub-Antarctic islands based on this assembled evidence. These include: (Type I) islands which accumulated little or no LGM ice; (Type II) islands with a limited LGM ice extent but evidence of extensive earlier continental shelf glaciations; (Type III) seamounts and volcanoes unlikely to have accumulated significant LGM ice cover; (Type IV) islands on shallow shelves with both terrestrial and submarine evidence of LGM (and/or earlier) ice expansion; (Type V) islands north of the Antarctic Polar Front with terrestrial evidence of LGM ice expansion; and (Type VI) islands with no data. Finally, we review the climatological and geomorphological settings that separate the glaciological history of the islands within this classification scheme.