18 results for Multiple generation scenarios

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

40.00%

Publisher:

Abstract:

Molecular genetic testing is commonly used to confirm clinical diagnoses of inherited urea cycle disorders (UCDs); however, conventional mutation screenings encompassing only the coding regions of genes may not detect disease-causing mutations occurring in regulatory elements and introns. Microarray-based target enrichment and next-generation sequencing now allow more-comprehensive genetic screening. We applied this approach to UCDs and combined it with the use of DNA bar codes for more cost-effective, parallel analyses of multiple samples.
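
For illustration only, a minimal sketch of the barcode-based demultiplexing step that makes such pooled, parallel analyses possible; the barcode sequences, read layout and sample names are hypothetical and are not taken from the study.

```python
# Illustrative sketch only: assigning pooled reads to samples by a leading
# DNA bar code. Barcode sequences, read layout and sample names are invented.
from collections import defaultdict

BARCODES = {"ACGTAC": "sample_01", "TGCATG": "sample_02", "GATCGA": "sample_03"}
BARCODE_LEN = 6

def demultiplex(reads):
    """Group reads by their leading barcode and strip the barcode off."""
    by_sample = defaultdict(list)
    for read in reads:
        tag, insert = read[:BARCODE_LEN], read[BARCODE_LEN:]
        by_sample[BARCODES.get(tag, "unassigned")].append(insert)
    return by_sample

pooled = ["ACGTACTTGACCAGTCA", "TGCATGGGCATTAGCTA", "NNNNNNACGTACGGTTA"]
for sample, inserts in demultiplex(pooled).items():
    print(sample, len(inserts))
```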

Relevance:

40.00%

Publisher:

Abstract:

Atmospheric concentrations of the three important greenhouse gases (GHGs) CO2, CH4 and N2O are mediated by processes in the terrestrial biosphere that are sensitive to climate and CO2. This leads to feedbacks between climate and land and has contributed to the sharp rise in atmospheric GHG concentrations since pre-industrial times. Here, we apply a process-based model to reproduce the historical atmospheric N2O and CH4 budgets within their uncertainties and apply future scenarios for climate, land-use change and reactive nitrogen (Nr) inputs to investigate future GHG emissions and their feedbacks with climate in a consistent and comprehensive framework [1]. Results suggest that in a business-as-usual scenario, terrestrial N2O and CH4 emissions increase by 80% and 45%, respectively, and the land becomes a net source of C by AD 2100. N2O and CH4 feedbacks imply an additional warming of 0.4–0.5 °C by AD 2300, on top of the 0.8–1.0 °C caused by terrestrial carbon-cycle and albedo feedbacks. The land biosphere represents an increasingly positive feedback to anthropogenic climate change and amplifies equilibrium climate sensitivity by 22–27%. Strong mitigation limits the increase of terrestrial GHG emissions and prevents the land biosphere from acting as an increasingly strong amplifier of anthropogenic climate change.
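
As a point of reference (not taken from the paper), the quoted 22–27% amplification can be expressed with the standard feedback-gain relation, in which an additional feedback of gain g rescales the equilibrium climate sensitivity:

```latex
\[
  \mathrm{ECS}^{*} \;=\; \frac{\mathrm{ECS}}{1 - g},
  \qquad
  \frac{\mathrm{ECS}^{*}}{\mathrm{ECS}} = 1.22\text{--}1.27
  \;\;\Rightarrow\;\;
  g \;=\; 1 - \frac{\mathrm{ECS}}{\mathrm{ECS}^{*}} \;\approx\; 0.18\text{--}0.21 .
\]
```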

Relevance:

40.00%

Publisher:

Abstract:

Results of a search for supersymmetry via direct production of third-generation squarks are reported, using 20.3 fb⁻¹ of proton-proton collision data at √s = 8 TeV recorded by the ATLAS experiment at the LHC in 2012. Two different analysis strategies based on monojet-like and c-tagged event selections are carried out to optimize the sensitivity for direct top squark-pair production in the decay channel to a charm quark and the lightest neutralino (t̃₁ → c + χ̃₁⁰) across the top squark–neutralino mass parameter space. No excess above the Standard Model background expectation is observed. The results are interpreted in the context of direct pair production of top squarks and presented in terms of exclusion limits in the (m_t̃₁, m_χ̃₁⁰) parameter space. A top squark of mass up to about 240 GeV is excluded at 95% confidence level for arbitrary neutralino masses, within the kinematic boundaries. Top squark masses up to 270 GeV are excluded for a neutralino mass of 200 GeV. In a scenario where the top squark and the lightest neutralino are nearly degenerate in mass, top squark masses up to 260 GeV are excluded. The results from the monojet-like analysis are also interpreted in terms of compressed scenarios for top squark-pair production in the decay channel t̃₁ → b + ff′ + χ̃₁⁰ and sbottom-pair production with b̃₁ → b + χ̃₁⁰, leading to a similar exclusion for nearly mass-degenerate third-generation squarks and the lightest neutralino. The results in this paper significantly extend previous results at colliders.
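
As a much-simplified illustration of how "no observed excess" translates into a 95% confidence-level exclusion, the sketch below computes a single-bin Poisson upper limit; it is not the CLs machinery actually used by ATLAS, and the counts are invented.

```python
# Simplified single-bin counting-experiment limit (NOT the ATLAS CLs procedure).
# Background expectation and observed count below are invented for illustration.
from scipy.stats import poisson
from scipy.optimize import brentq

def upper_limit_95(n_obs, background):
    """Signal yield s_up such that P(n <= n_obs | b + s_up) = 0.05."""
    return brentq(lambda s: poisson.cdf(n_obs, background + s) - 0.05, 0.0, 1e4)

n_obs, b = 100, 100.0                      # data consistent with background only
s_up = upper_limit_95(n_obs, b)
print(f"signal yields above {s_up:.1f} events excluded at 95% CL")
# Comparing s_up with the predicted signal yield at each mass point is what
# produces an exclusion contour in the (m_t̃1, m_χ̃1⁰) plane.
```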

Relevance:

30.00%

Publisher:

Abstract:

The precise timing of events in the brain has consequences for intracellular processes, synaptic plasticity, integration and network behaviour. Pyramidal neurons, the most widespread excitatory neurons of the neocortex, have multiple spike initiation zones, which interact via dendritic and somatic spikes actively propagating in all directions within the dendritic tree. For these neurons, therefore, both the location and timing of synaptic inputs are critical. The time window within which the backpropagating action potential can influence dendritic spike generation has been extensively studied in layer 5 neocortical pyramidal neurons of rat somatosensory cortex. Here, we re-examine this coincidence-detection window for pyramidal cell types across the rat somatosensory cortex in layers 2/3, 5 and 6. We find that the time window for optimal interaction is widest and shifted in layer 5 pyramidal neurons relative to cells in layers 6 and 2/3. Inputs arriving at the same time and location will therefore differentially affect spike-timing-dependent processes in the different classes of pyramidal neurons.

Relevance:

30.00%

Publisher:

Abstract:

With the advent of high-throughput sequencing (HTS), the emerging science of metagenomics is transforming our understanding of the relationships of microbial communities with their environments. While metagenomics aims to catalogue the genes present in a sample, metatranscriptomics, by assessing which genes are actively expressed, can provide a mechanistic understanding of community inter-relationships. To achieve these goals, several challenges need to be addressed, from sample preparation to sequence processing, statistical analysis and functional annotation. Here we use an inbred non-obese diabetic (NOD) mouse model, in which germ-free animals were colonized with a defined mixture of eight commensal bacteria, to explore methods of RNA extraction and to develop a pipeline for the generation and analysis of metatranscriptomic data. Applying the Illumina HTS platform, we sequenced 12 NOD cecal samples prepared using multiple RNA-extraction protocols. The absence of a complete set of reference genomes necessitated a peptide-based search strategy. Up to 16% of sequence reads could be matched to a known bacterial gene. Phylogenetic analysis of the mapped ORFs revealed a distribution consistent with ribosomal RNA, the majority from Bacteroides or Clostridium species. To place these HTS data within a systems context, we mapped the relative abundance of corresponding Escherichia coli homologs onto metabolic and protein-protein interaction networks. These maps identified bacterial processes with components that were well represented in the datasets. In summary, this study highlights the potential of exploiting the economy of HTS platforms for metatranscriptomics.
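
A toy sketch of the peptide-based matching idea, using six-frame translation with Biopython; the reference peptides, homolog labels and the read below are invented, and the real pipeline used dedicated search tools against full protein databases.

```python
# Toy sketch of a peptide-based search: translate each read in all six frames
# and look for exact hits against a small reference peptide set. The reference
# peptides and the read are invented for illustration.
from Bio.Seq import Seq

# Hypothetical reference peptides keyed by an E. coli homolog identifier.
REFERENCE_PEPTIDES = {"MKTAYIAK": "b0001_thrL", "GLSDGEWQ": "b0002_thrA"}

def six_frame_peptides(read, min_len=8):
    """Yield peptide windows from all six reading frames of a nucleotide read."""
    seq = Seq(read)
    for strand in (seq, seq.reverse_complement()):
        for frame in range(3):
            sub = strand[frame:]
            sub = sub[: len(sub) - len(sub) % 3]      # trim any partial codon
            protein = str(sub.translate(to_stop=False))
            for i in range(len(protein) - min_len + 1):
                yield protein[i : i + min_len]

def match_read(read):
    """Return the set of reference homologs hit by any peptide window."""
    return {REFERENCE_PEPTIDES[p] for p in six_frame_peptides(read) if p in REFERENCE_PEPTIDES}

read = "ATGAAAACAGCATATATTGCAAAAGGCCTG"   # invented 30-mer
print(match_read(read) or "no hit")
```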

Relevance:

30.00%

Publisher:

Abstract:

CD4(+) T cells play a central role in the pathogenesis of multiple sclerosis (MS). Generation, activation and effector function of these cells crucially depend on their interaction with MHC II-peptide complexes displayed by antigen-presenting cells (APC). Processing and presentation of self-antigens by different APC therefore influence the disease course at all stages. Selection by thymic APC leads to the generation of autoreactive T cells, which can be activated by peripheral APC. Reactivation by central nervous system APC leads to the initiation of the inflammatory response resulting in demyelination. In this review we will focus on how MHC class II antigenic epitopes are created by different APC from the thymus, the periphery and the brain, and will discuss the relevance of the balance between creation and destruction of such epitopes in the context of MS. A solid understanding of these processes offers the possibility of designing future therapeutic strategies.

Relevance:

30.00%

Publisher:

Abstract:

Aims: Early-generation drug-eluting stent (DES) overlap (OL) is associated with impaired long-term clinical outcomes whereas the impact of OL with newer-generation DES is unknown. Our aim was to assess the impact of OL on long-term clinical outcomes among patients treated with newer-generation DES. Methods and results: We analysed the three-year clinical outcomes of 3,133 patients included in a prospective DES registry according to stent type (sirolimus-eluting stents [SES; N=1,532] versus everolimus-eluting stents [EES; N=1,601]), and the presence or absence of OL. The primary outcome was a composite of death, myocardial infarction (MI), and target vessel revascularisation (TVR). The primary endpoint was more common in patients with OL (25.1%) than in those with multiple DES without OL (20.8%, adj HR=1.46, 95% CI: 1.03-2.09) and patients with a single DES (18.8%, adj HR=1.74, 95% CI: 1.34-2.25, p<0.001) at three years. A stratified analysis by stent type showed a higher risk of the primary outcome in SES with OL (28.7%) compared to other SES groups (without OL: 22.6%, p=0.04; single DES: 17.6%, p<0.001), but not between EES with OL (22.3%) and other EES groups (without OL: 18.5%, p=0.30; single DES: 20.4%, p=0.20). Conclusions: DES overlap is associated with impaired clinical outcomes during long-term follow-up. Compared with SES, EES provide similar clinical outcomes irrespective of DES overlap status.
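
For readers unfamiliar with adjusted hazard ratios, a minimal sketch of how such an estimate could be produced with a Cox proportional-hazards model (using the lifelines package); the column names, covariates and toy data are placeholders, not the registry's actual variables.

```python
# Minimal Cox proportional-hazards sketch for an "overlap vs no overlap"
# adjusted hazard ratio. Toy data and covariates are placeholders only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_to_event": [3.0, 1.2, 2.1, 0.8, 2.5, 3.0, 1.5, 2.8],  # follow-up time
    "event":          [1,   1,   0,   1,   0,   0,   1,   0],    # death/MI/TVR occurred
    "overlap":        [0,   1,   0,   1,   1,   0,   1,   0],    # overlapping DES implanted
    "age":            [72,  74,  58,  69,  61,  65,  70,  63],   # example adjustment covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_event", event_col="event")
print(cph.hazard_ratios_["overlap"])   # adjusted hazard ratio for overlap
```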

Relevance:

30.00%

Publisher:

Abstract:

Climate targets are designed to inform policies that would limit the magnitude and impacts of climate change caused by anthropogenic emissions of greenhouse gases and other substances. The target that is currently recognized by most world governments [1] places a limit of two degrees Celsius on the global mean warming since preindustrial times. This would require large sustained reductions in carbon dioxide emissions during the twenty-first century and beyond [2-4]. Such a global temperature target, however, is not sufficient to control many other quantities, such as transient sea level rise [5], ocean acidification [6,7] and net primary production on land [8,9]. Here, using an Earth system model of intermediate complexity (EMIC) in an observation-informed Bayesian approach, we show that allowable carbon emissions are substantially reduced when multiple climate targets are set. We take into account uncertainties in physical and carbon cycle model parameters, radiative efficiencies [10], climate sensitivity [11] and carbon cycle feedbacks [12,13], along with a large set of observational constraints. Within this framework, we explore a broad range of economically feasible greenhouse gas scenarios from the integrated assessment community [14-17] to determine the likelihood of meeting a combination of specific global and regional targets under various assumptions. For any given likelihood of meeting a set of such targets, the allowable cumulative emissions are greatly reduced from those inferred from the temperature target alone. Therefore, temperature targets alone are unable to comprehensively limit the risks from anthropogenic emissions.
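
A toy Monte Carlo (not the paper's observation-informed Bayesian EMIC framework) showing why adding targets shrinks the allowable cumulative emissions: each target is met only by part of the parameter ensemble, so the joint requirement is stricter than any single one. The linear responses and parameter distributions are invented for illustration.

```python
# Toy Monte Carlo illustrating how multiple targets reduce allowable emissions.
# The linear responses and distributions are invented; this is not the paper's
# EMIC/Bayesian framework.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Invented uncertain responses per 1000 GtC (= 1 TtC) of cumulative emissions:
warming_per_TtC  = rng.normal(1.8, 0.4, n)    # deg C per TtC
sealevel_per_TtC = rng.normal(0.35, 0.10, n)  # m of transient rise per TtC

def allowable_emissions(targets, likelihood=0.66):
    """Largest cumulative emission (TtC) meeting every target with the given probability."""
    for e in np.arange(3.0, 0.0, -0.01):
        ok = np.ones(n, dtype=bool)
        if "warming" in targets:
            ok &= warming_per_TtC * e <= targets["warming"]
        if "sealevel" in targets:
            ok &= sealevel_per_TtC * e <= targets["sealevel"]
        if ok.mean() >= likelihood:
            return round(e, 2)
    return 0.0

print(allowable_emissions({"warming": 2.0}))                   # temperature target only
print(allowable_emissions({"warming": 2.0, "sealevel": 0.5}))  # temperature + sea-level target
```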

Relevance:

30.00%

Publisher:

Abstract:

Background: Recently, Cipriani and colleagues examined the relative efficacy of 12 new-generation antidepressants for major depression using network meta-analytic methods. They found that some of these medications outperformed others in patient response to treatment. However, several methodological criticisms have been raised about network meta-analysis in general, and Cipriani's analysis in particular, which create the concern that the stated superiority of some antidepressants relative to others may be unwarranted. Materials and Methods: A Monte Carlo simulation was conducted which involved replicating Cipriani's network meta-analysis under the null hypothesis (i.e., no true differences between antidepressants). The following simulation strategy was implemented: (1) 1000 simulations were generated under the null hypothesis (i.e., under the assumption that there were no differences among the 12 antidepressants), (2) each of the 1000 simulations was network meta-analyzed, and (3) the total number of false positive results from the network meta-analyses was calculated. Findings: More than 7 times out of 10, the network meta-analysis resulted in one or more comparisons that indicated the superiority of at least one antidepressant when no such true differences among them existed. Interpretation: Based on our simulation study, the results indicated that under conditions identical to those of the 117 RCTs with 236 treatment arms contained in Cipriani et al.'s meta-analysis, one or more false claims about the relative efficacy of antidepressants will be made over 70% of the time. As others have also shown, there is little evidence in these trials that any antidepressant is more effective than another. The tendency of network meta-analyses to generate false positive results should be considered when conducting multiple-comparison analyses.
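
A simplified sketch of the null-simulation logic described in the Methods: under no true differences among 12 treatments, unadjusted pairwise testing still flags "superiority" in most simulations. It uses plain pairwise t-tests on arm-level data rather than a full network meta-analysis, and all sample sizes are illustrative.

```python
# Simplified null simulation of the multiple-comparisons problem the authors
# describe. Each simulation draws 12 treatment arms with identical true effects
# and tests all pairwise differences; we count how often at least one
# comparison is "significant". Sizes are illustrative, not Cipriani et al.'s
# actual trial network, and no network pooling is performed.
import numpy as np
from scipy import stats
from itertools import combinations

rng = np.random.default_rng(42)
n_sims, n_treatments, n_per_arm = 1000, 12, 200

false_positive_sims = 0
for _ in range(n_sims):
    # Under the null, every treatment has the same true effect (here: 0).
    arms = rng.normal(0.0, 1.0, size=(n_treatments, n_per_arm))
    any_significant = False
    for i, j in combinations(range(n_treatments), 2):
        _, p = stats.ttest_ind(arms[i], arms[j])
        if p < 0.05:
            any_significant = True
            break
    false_positive_sims += any_significant

print(f"simulations with >=1 false positive: {false_positive_sims / n_sims:.0%}")
# With 66 unadjusted pairwise tests this fraction is far above 5%, which is
# the kind of inflation the simulation study quantifies for the network setting.
```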

Relevance:

30.00%

Publisher:

Abstract:

Conservative medical treatment is commonly recommended first for patients with uncomplicated Type-B aortic dissection (AD). However, if dissection-related complications occur, endovascular repair or open surgery is performed. Here we establish computational models of AD based on radiological three-dimensional images of a patient at initial presentation and after 4 years of best medical treatment (BMT). Computational fluid dynamics analyses are performed to quantitatively investigate the hemodynamic features of AD. Entry and re-entries (functioning as entries and outlets) are identified in the initial and follow-up models, and obvious variations of the inter-luminal flow exchange are revealed. Computational studies indicate that the reduction of blood pressure in BMT patients lowers pressure and wall shear stress in the thoracic aorta in general, and flattens the pressure distribution on the outer wall of the dissection, potentially reducing the progressive enlargement of the false lumen. Finally, scenario studies of endovascular aortic repair are conducted. The results indicate that, for patients with multiple tears, stent-grafts occluding all re-entries would be required to effectively reduce inter-luminal blood communication and thus induce thrombosis in the false lumen. This implies that computational flow analyses may identify entries and relevant re-entries between the true and false lumen and potentially assist in stent-graft planning.

Relevance:

30.00%

Publisher:

Abstract:

Abstract: Research on human values within the family focuses on value congruence between family members (Knafo & Schwartz, 2004), based on the assumption that the transmission of values is part of a child's socialization process. Within the family, values are not only implicitly transmitted through this process but also explicitly conveyed through the educational goals of parents (Grusec et al., 2000; Knafo & Schwartz, 2003, 2004, 2009). However, there is a lack of empirical evidence on the role of family characteristics in the value transmission process, especially for families with young children. Thus, the study presented here had multiple aims: Firstly, it analyzed the congruency between mothers' and fathers' values and their value-based educational goals. Secondly, it examined the influence of mothers' and fathers' socio-demographic characteristics on their educational goals. Thirdly, it analyzed the differences in parental educational goals between families with daughters and families with sons. Finally, it examined the congruency between children's values and the value-based educational goals of their parents. The value transmission process within families with young children was analyzed using data from complete families (child, mother and father) in Switzerland (N = 265). The child sample consisted of 139 boys and 126 girls aged between 7 and 9 years. Parents' values and parental educational goals were assessed using the Portrait Value Questionnaire (PVQ-21) (Schwartz, 2005). Children's values were assessed using the Picture-Based Value Survey for Children (PBVS-C) (Döring et al., 2010). Regarding the role of the family context in shaping children's values, the results of the study show that, on average, parents are similar not only with respect to their value profiles but also with regard to which values they would like to transmit to their children. Our findings also suggest that children's values at an early age are shaped more strongly by mothers' values than by fathers' values. Moreover, our results show differences in value transmission with respect to the child's gender. In particular, they suggest that value transmission within the family has a greater influence on female than on male offspring.

Relevance:

30.00%

Publisher:

Abstract:

Biomarker research relies on tissue microarrays (TMA). TMAs are produced by repeated transfer of small tissue cores from a 'donor' block into a 'recipient' block and then used for a variety of biomarker applications. The construction of conventional TMAs is labor intensive, imprecise, and time-consuming. Here, a protocol using next-generation Tissue Microarrays (ngTMA) is outlined. ngTMA is based on TMA planning and design, digital pathology, and automated tissue microarraying. The protocol is illustrated using an example of 134 metastatic colorectal cancer patients. Histological, statistical and logistical aspects are considered, such as the tissue type, specific histological regions, and cell types for inclusion in the TMA, the number of tissue spots, sample size, statistical analysis, and number of TMA copies. Histological slides for each patient are scanned and uploaded onto a web-based digital platform. There, they are viewed and annotated (marked) using a 0.6-2.0 mm diameter tool, multiple times using various colors to distinguish tissue areas. Donor blocks and 12 'recipient' blocks are loaded into the instrument. Digital slides are retrieved and matched to donor block images. Repeated arraying of annotated regions is automatically performed resulting in an ngTMA. In this example, six ngTMAs are planned containing six different tissue types/histological zones. Two copies of the ngTMAs are desired. Three to four slides for each patient are scanned; 3 scan runs are necessary and performed overnight. All slides are annotated; different colors are used to represent the different tissues/zones, namely tumor center, invasion front, tumor/stroma, lymph node metastases, liver metastases, and normal tissue. 17 annotations/case are made; time for annotation is 2-3 min/case. 12 ngTMAs are produced containing 4,556 spots. Arraying time is 15-20 hr. Due to its precision, flexibility and speed, ngTMA is a powerful tool to further improve the quality of TMAs used in clinical and translational research.
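
The quoted spot total is consistent with the other numbers in the protocol, assuming each annotated region contributes one core per TMA copy (an assumption, since the mapping of annotations to cores is not spelled out here):

```python
# Consistency check of the quoted ngTMA numbers (assumes one core per
# annotated region per TMA copy, which is not stated explicitly above).
patients = 134
annotations_per_case = 17
tma_copies = 2

print(patients * annotations_per_case * tma_copies)   # 4556 spots, matching the text
```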

Relevance:

30.00%

Publisher:

Abstract:

Previous studies of the sediments of Lake Lucerne have shown that massive subaqueous mass movements affecting unconsolidated sediments on lateral slopes are a common process in this lake and, in view of historical reports describing damaging waves on the lake, it was suggested that tsunamis generated by mass movements represent a considerable natural hazard on the lakeshores. Newly performed numerical simulations combining two-dimensional, depth-averaged models for mass-movement propagation and for tsunami generation, propagation and inundation reproduce a number of reported tsunami effects. Four analysed mass-movement scenarios, three of them based on documented slope failures involving volumes of 5.5 to 20.8 × 10⁶ m³, show peak wave heights of several metres and maximum runup of 6 to more than 10 m in the directly affected basins, while effects in neighbouring basins are less drastic. The tsunamis cause large-scale inundation over distances of several hundred metres on flat alluvial plains close to the mass-movement source areas. Basins at the ends of the lake experience regular water-level oscillations with characteristic periods of several minutes. The vulnerability of potentially affected areas has increased dramatically since the times of the damaging historical events, recommending a thorough evaluation of the hazard.
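
For context, "depth-averaged" tsunami models of this kind solve shallow-water equations; the generic one-dimensional textbook form is shown below, not the specific model used in the study:

```latex
\[
  \frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0,
  \qquad
  \frac{\partial (hu)}{\partial t}
    + \frac{\partial}{\partial x}\!\left(hu^{2} + \tfrac{1}{2}gh^{2}\right)
    = -\,gh\,\frac{\partial b}{\partial x} - \tau_{f},
\]
```

Here h is the water depth, u the depth-averaged velocity, b the bed elevation and τ_f a basal friction term; long waves then propagate at roughly √(gh).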

Relevance:

30.00%

Publisher:

Abstract:

Air was sampled from the porous firn layer at the NEEM site in Northern Greenland. We use an ensemble of ten reference tracers of known atmospheric history to characterise the transport properties of the site. By analysing uncertainties in both the data and the reference-gas atmospheric histories, we can objectively assign weights to each of the gases used for the depth-diffusivity reconstruction. We define an objective root-mean-square criterion that is minimised in the model tuning procedure. Each tracer constrains the firn profile differently through its unique atmospheric history and free-air diffusivity, making our multiple-tracer characterisation method a clear improvement over the commonly used single-tracer tuning. Six firn air transport models are tuned to the NEEM site; all models successfully reproduce the data within a 1σ Gaussian distribution. A comparison between two replicate boreholes drilled 64 m apart shows differences in measured mixing ratio profiles that exceed the experimental error. We find evidence that diffusivity does not vanish completely in the lock-in zone, as is commonly assumed. The ice age–gas age difference (Δage) at the firn-ice transition is calculated to be 182 (+3/−9) yr. We further present the first intercomparison study of firn air models, in which we introduce diagnostic scenarios designed to probe specific aspects of the model physics. Our results show that there are major differences in the way the models handle advective transport. Furthermore, diffusive fractionation of isotopes in the firn is poorly constrained by the models, which has consequences for attempts to reconstruct the isotopic composition of trace gases back in time using firn air and ice core records.
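
A minimal sketch of what such a weighted root-mean-square tuning criterion can look like, assuming each tracer's residuals are normalised by its combined measurement and atmospheric-history uncertainty; the names and the exact weighting are assumptions, not the paper's implementation.

```python
# Minimal sketch of a multi-tracer tuning objective of the kind described:
# a root-mean-square misfit between modelled and measured mixing ratios in
# which each tracer is weighted by its combined uncertainty. Variable names
# and the exact weighting are assumptions, not the paper's implementation.
import numpy as np

def weighted_rms_misfit(modelled, measured, sigma):
    """
    modelled, measured : dict tracer -> np.ndarray of mixing ratios vs depth
    sigma              : dict tracer -> np.ndarray of 1-sigma uncertainties
    """
    residuals = [(modelled[t] - measured[t]) / sigma[t] for t in measured]
    r = np.concatenate(residuals)
    return np.sqrt(np.mean(r ** 2))

# A diffusivity profile would then be adjusted (e.g. with scipy.optimize.minimize)
# until this criterion is minimal; low-uncertainty tracers pull hardest on the fit.
```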