46 results for Multi-sector models


Relevance: 30.00%

Publisher:

Abstract:

A multi-model analysis of Atlantic multidecadal variability is performed with the following aims: to investigate the similarities to observations; to assess the strength and relative importance of the different elements of the mechanism proposed by Delworth et al. (J Clim 6:1993–2011, 1993) (hereafter D93) among coupled general circulation models (CGCMs); and to relate model differences to mean systematic error. The analysis is performed with long control simulations from ten CGCMs, with lengths ranging between 500 and 3600 years. In most models the variations of sea surface temperature (SST) averaged over the North Atlantic show considerable power on multidecadal time scales, but with different periodicity. The SST variations are largest in the mid-latitude region, consistent with the short instrumental record. Despite large differences in model configurations, we find considerable consistency among the models in terms of processes. In eight of the ten models the mid-latitude SST variations are significantly correlated with fluctuations in the Atlantic meridional overturning circulation (AMOC), suggesting a link to changes in northward heat transport. Consistent with this link, the three models with the weakest AMOC have the largest cold SST bias in the North Atlantic. There is no linear relationship on decadal timescales between the AMOC and the North Atlantic Oscillation in the models. Analysis of the key elements of the D93 mechanism revealed the following: most models present strong evidence that high-latitude winter mixing precedes AMOC changes, although the regions of wintertime convection differ among models. In most models salinity-induced density anomalies in the convective region tend to lead the AMOC, while temperature-induced density anomalies lead the AMOC in only one model. However, the analysis shows that salinity may play an overly important role in most models because of cold temperature biases in their relevant convective regions. In most models subpolar gyre variations tend to lead AMOC changes, and this relation is strong in more than half of the models.
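The lead-lag relationships reported above (winter mixing and subpolar gyre variations leading the AMOC) are typically diagnosed with lagged correlations between annual-mean indices from the control runs. A minimal sketch of such a diagnostic, with synthetic series standing in for model output (the series, the 3-yr lead, and the noise level are illustrative assumptions, not values from the study):

```python
import numpy as np

def lag_correlation(x, y, max_lag):
    """Pearson correlation of x against y for a range of lags.

    A positive lag means x leads y (x at time t vs. y at t + lag).
    """
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[: len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[: len(y) + lag]
        out[lag] = float(np.mean(a * b))
    return out

# Synthetic 500-yr "control run": the mixing index leads the AMOC index by 3 yr.
rng = np.random.default_rng(0)
mixing = rng.standard_normal(500)
amoc = np.roll(mixing, 3) + 0.3 * rng.standard_normal(500)

corr = lag_correlation(mixing, amoc, max_lag=10)
best_lag = max(corr, key=corr.get)  # lag with the strongest correlation
```

With noise this weak, the maximum of the lag-correlation function recovers the imposed 3-yr lead of the mixing index.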

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND High-risk prostate cancer (PCa) is an extremely heterogeneous disease. A clear definition of prognostic subgroups is mandatory. OBJECTIVE To develop a pretreatment prognostic model for PCa-specific survival (PCSS) in high-risk PCa based on combinations of unfavorable risk factors. DESIGN, SETTING, AND PARTICIPANTS We conducted a retrospective multicenter cohort study including 1360 consecutive patients with high-risk PCa treated at eight European high-volume centers. INTERVENTION Retropubic radical prostatectomy with pelvic lymphadenectomy. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Two Cox multivariable regression models were constructed to predict PCSS as a function of dichotomized clinical stage (<cT3 vs cT3-4), Gleason score (GS) (2-7 vs 8-10), and prostate-specific antigen (PSA; ≤20 ng/ml vs >20 ng/ml). The first "extended" model includes all seven possible combinations; the second "simplified" model includes three subgroups: a good prognosis subgroup (one single high-risk factor); an intermediate prognosis subgroup (PSA >20 ng/ml and stage cT3-4); and a poor prognosis subgroup (GS 8-10 in combination with at least one other high-risk factor). The predictive accuracy of the models was summarized and compared. Survival estimates and clinical and pathologic outcomes were compared between the three subgroups. RESULTS AND LIMITATIONS The simplified model yielded an R² of 33% with a 5-yr area under the curve (AUC) of 0.70, with no significant loss of predictive accuracy compared with the extended model (R²: 34%; AUC: 0.71). The 5- and 10-yr PCSS rates were 98.7% and 95.4%, 96.5% and 88.3%, and 88.8% and 79.7% for the good, intermediate, and poor prognosis subgroups, respectively (p = 0.0003). Overall survival, clinical progression-free survival, and histopathologic outcomes worsened significantly in a stepwise fashion from the good to the poor prognosis subgroups.
Limitations of the study are the retrospective design and the long study period. CONCLUSIONS This study presents an intuitive and easy-to-use stratification of high-risk PCa into three prognostic subgroups. The model is useful for counseling and decision making in the pretreatment setting.
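The simplified model's three subgroups reduce to a small decision rule over the three dichotomized risk factors. A sketch (the function name and boolean encoding are my own; the grouping logic follows the abstract):

```python
def prognostic_subgroup(ct3_4: bool, gleason_8_10: bool, psa_over_20: bool) -> str:
    """Assign a high-risk PCa patient to a subgroup of the simplified model.

    ct3_4: clinical stage cT3-4; gleason_8_10: GS 8-10; psa_over_20: PSA > 20 ng/ml.
    """
    n_factors = sum([ct3_4, gleason_8_10, psa_over_20])
    if n_factors == 0:
        raise ValueError("not high-risk: at least one factor required")
    if n_factors == 1:
        return "good"            # one single high-risk factor
    if gleason_8_10:
        return "poor"            # GS 8-10 plus at least one other factor
    return "intermediate"        # PSA > 20 ng/ml and stage cT3-4, GS 2-7

assert prognostic_subgroup(True, False, False) == "good"
assert prognostic_subgroup(True, False, True) == "intermediate"
assert prognostic_subgroup(False, True, True) == "poor"
```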

Relevance: 30.00%

Publisher:

Abstract:

In these proceedings we review the flavour phenomenology of 2HDMs with generic Yukawa structures [1]. We first consider the quark sector and find that, despite the stringent constraints from FCNC processes, large effects in tauonic B decays are still possible. We then consider lepton flavour observables, show correlations between μ → eγ and μ⁻ → e⁻e⁺e⁻ in the 2HDM of type III, and give upper bounds on the lepton-flavour-violating B decay B_d → μe.

Relevance: 30.00%

Publisher:

Abstract:

Software corpora facilitate reproducibility of analyses; however, static analysis for an entire corpus still requires considerable effort, often duplicated unnecessarily by multiple users. Moreover, most corpora are designed for single languages, increasing the effort required for cross-language analysis. To address these aspects we propose Pangea, an infrastructure allowing fast development of static analyses on multi-language corpora. Pangea uses language-independent meta-models stored as object-model snapshots that can be directly loaded into memory and queried without any parsing overhead. To reduce the effort of performing static analyses, Pangea provides out-of-the-box support for creating and refining analyses in a dedicated environment, deploying an analysis on an entire corpus using a runner that supports parallel execution, and exporting results in various formats. In this tool demonstration we introduce Pangea and present several usage scenarios that illustrate how it reduces the cost of analysis.

Relevance: 30.00%

Publisher:

Abstract:

Accurate predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can provide early warning of upcoming hypo-/hyperglycemic events and thus improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed that combine multiple predictors to produce results superior to those of single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction, cARX, and a recurrent neural network, RNN). Data fusion techniques based on (i) Dempster-Shafer Evidential Theory (DST), (ii) Genetic Algorithms (GA), and (iii) Genetic Programming (GP) were used to merge the complementary performances of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance, with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before the occurrence of events were 13.0 and 12.1 min for hypo- and hyperglycemic events, respectively. Compared to the cARX and RNN models, and to a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
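As a point of reference for the fusion idea, the linear-fusion baseline that the DST/GA/GP schemes are compared against can be sketched as a least-squares combination of the two predictor outputs. Synthetic glucose traces stand in for the CGM data, and the cARX/RNN outputs are mocked as independently noisy versions of the target (all values illustrative):

```python
import numpy as np

def linear_fusion_weights(preds, target):
    """Least-squares weights for fusing predictor outputs (columns of preds)."""
    w, *_ = np.linalg.lstsq(preds, target, rcond=None)
    return w

def rmse(pred, target):
    return float(np.sqrt(np.mean((pred - target) ** 2)))

rng = np.random.default_rng(1)
glucose = np.cumsum(rng.standard_normal(300)) + 120   # synthetic trace, mg/dl
p_carx = glucose + rng.normal(0, 8, 300)              # mocked predictor 1
p_rnn = glucose + rng.normal(0, 8, 300)               # mocked predictor 2

X = np.column_stack([p_carx, p_rnn])
w = linear_fusion_weights(X, glucose)
fused = X @ w

# independent errors partially cancel: fused RMSE below each single predictor
assert rmse(fused, glucose) < min(rmse(p_carx, glucose), rmse(p_rnn, glucose))
```

This is only the linear baseline; the point of the paper is that the nonlinear fusion schemes improve on it further.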

Relevance: 30.00%

Publisher:

Abstract:

ATLAS measurements of the azimuthal anisotropy in lead–lead collisions at √s_NN = 2.76 TeV are shown, using a dataset of approximately 7 μb⁻¹ collected at the LHC in 2010. The measurements are performed for charged particles with transverse momenta 0.5 < pT < 20 GeV and in the pseudorapidity range |η| < 2.5. The anisotropy is characterized by the Fourier coefficients, vn, of the charged-particle azimuthal angle distribution for n = 2–4. The Fourier coefficients are evaluated using multi-particle cumulants calculated with the generating function method. Results on the transverse momentum, pseudorapidity, and centrality dependence of the vn coefficients are presented. The elliptic flow, v2, is obtained from the two-, four-, six-, and eight-particle cumulants, while the higher-order coefficients, v3 and v4, are determined with two- and four-particle cumulants. Flow harmonics vn measured with four-particle cumulants are significantly reduced compared to the measurements involving two-particle cumulants. A comparison to vn measurements obtained using different analysis methods and previously reported by the LHC experiments is also shown. Results of measurements of flow fluctuations evaluated with multi-particle cumulants are shown as a function of transverse momentum and collision centrality. Models of the initial spatial geometry and its fluctuations fail to describe the flow fluctuation measurements.
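The two-particle cumulant mentioned above has a compact per-event form via the flow vector Q_n = Σ_j exp(i·n·φ_j): ⟨2⟩ = (|Q_n|² − M)/(M(M−1)), with v_n{2} = √⟨⟨2⟩⟩. The sketch below estimates v2{2} from toy events with a known elliptic modulation; the event generator and multiplicities are illustrative, not ATLAS data, and the paper itself uses the generating-function method rather than direct Q-cumulants:

```python
import numpy as np

def v2_two_particle(events):
    """Estimate v2{2} from per-event azimuthal angles via the Q-vector
    form of the two-particle cumulant: <2> = (|Q2|^2 - M) / (M(M-1))."""
    corr = []
    for phi in events:
        m = len(phi)
        q2 = np.sum(np.exp(2j * phi))
        corr.append((abs(q2) ** 2 - m) / (m * (m - 1)))
    return float(np.sqrt(np.mean(corr)))

# Toy events with a true elliptic modulation dN/dphi ~ 1 + 2*v2*cos(2*phi)
rng = np.random.default_rng(2)
v2_true, events = 0.1, []
for _ in range(200):
    phi = []
    while len(phi) < 100:  # rejection-sample 100 particles per event
        x = rng.uniform(0, 2 * np.pi)
        if rng.uniform(0, 1 + 2 * v2_true) < 1 + 2 * v2_true * np.cos(2 * x):
            phi.append(x)
    events.append(np.array(phi))

v2_est = v2_two_particle(events)  # should recover ~0.1
```

In real data v2{2} also absorbs non-flow correlations and flow fluctuations, which is exactly why the measurement above compares it against the four-, six-, and eight-particle cumulants.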

Relevance: 30.00%

Publisher:

Abstract:

The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans' motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussian (MOG) models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to simplify the burden often imposed on users; here, only a single image which includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode ‘skeletons’ for straightforward motility quantification. We test our algorithm on various locomotive environments and compare performances with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans' segmentation and ‘skeletonizing’ across a wide range of motility assays.
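The MOG idea at the heart of MEME can be illustrated with a deliberately simplified, single-Gaussian-per-class version: classify each pixel by whichever learned intensity model (background environment or nematode) assigns it the higher likelihood. The image values, model parameters, and frame geometry below are synthetic illustrations, not MEME's actual multi-component models:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def segment(image, bg_model, fg_model):
    """Label a pixel as nematode (True) where the foreground Gaussian
    is more likely than the background Gaussian."""
    p_bg = gaussian_pdf(image, *bg_model)
    p_fg = gaussian_pdf(image, *fg_model)
    return p_fg > p_bg

# Synthetic frame: bright background (~200 +/- 10), dark "worm" (~60 +/- 10)
rng = np.random.default_rng(3)
frame = rng.normal(200, 10, (64, 64))
frame[30:34, 10:50] = rng.normal(60, 10, (4, 40))  # the worm region

mask = segment(frame, bg_model=(200.0, 10.0), fg_model=(60.0, 10.0))
assert mask[31, 20] and not mask[5, 5]
```

MEME additionally learns both mixtures from a single user-provided image and extracts skeletons from the resulting masks; this sketch only shows the likelihood-ratio classification step.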

Relevance: 30.00%

Publisher:

Abstract:

PURPOSE The impact of cardiopulmonary bypass in level III-IV tumor thrombectomy on surgical and oncologic outcomes is unknown. We determined the impact of cardiopulmonary bypass on overall and cancer-specific survival, as well as on surgical complication rates and immediate outcomes, in patients undergoing nephrectomy and level III-IV tumor thrombectomy with or without cardiopulmonary bypass. MATERIALS AND METHODS We retrospectively analyzed 362 patients with renal cell cancer and level III or IV tumor thrombus treated from 1992 to 2012 at 22 U.S. and European centers. Cox proportional hazards models were used to compare overall and cancer-specific survival between patients with and without cardiopulmonary bypass. Perioperative mortality and complication rates were assessed using logistic regression analyses. RESULTS Median overall survival was 24.6 months in noncardiopulmonary bypass cases and 26.6 months in cardiopulmonary bypass cases. Overall survival and cancer-specific survival did not differ significantly between the two groups on univariate analysis or when adjusting for known risk factors. On multivariate analysis no significant differences were seen in hospital length of stay, Clavien 1-4 complication rate, intraoperative or 30-day mortality, or cancer-specific survival. Limitations include the retrospective nature of the study. CONCLUSIONS In our multi-institutional analysis the use of cardiopulmonary bypass did not significantly impact cancer-specific survival or overall survival in patients undergoing nephrectomy and level III or IV tumor thrombectomy. Neither approach was independently associated with increased mortality on multivariate analysis. A higher rate of surgical complications was not independently associated with the use of cardiopulmonary bypass.

Relevance: 30.00%

Publisher:

Abstract:

Large uncertainties exist concerning the impact of Greenland ice sheet melting on the Atlantic meridional overturning circulation (AMOC) in the future, partly due to the different sensitivity of the AMOC to freshwater input in the North Atlantic among climate models. Here we analyse five projections from different coupled ocean–atmosphere models with an additional 0.1 Sv (1 Sv = 10⁶ m³/s) of freshwater released around Greenland between 2050 and 2089. We find on average a further weakening of the AMOC at 26°N of 1.1 ± 0.6 Sv, representing a 27 ± 14% supplementary weakening in 2080–2089, as compared to the weakening relative to 2006–2015 due to the effect of the external forcing only. This weakening is lower than what has been found with the same ensemble of models in an identical experimental set-up but under recent historical climate conditions. This lower sensitivity in a warmer world is explained by two main factors. First, a tendency towards decoupling is detected between the surface and the deep ocean, caused by increased thermal stratification in the North Atlantic under the effect of global warming. This induces a shoaling of deep ocean ventilation through convection, so that only intermediate levels are ventilated. The second important effect concerns the so-called Canary Current freshwater leakage: a process by which fresh water additionally released in the North Atlantic leaks along the Canary Current and escapes the convection zones towards the subtropical area. This leakage increases in a warming climate as a consequence of decreasing gyre asymmetry due to changes in Ekman pumping. We suggest that these modifications are related to the northward shift of the jet stream in a warmer world. For these two reasons the AMOC is less susceptible to freshwater perturbations (near the deep water formation sites) in the North Atlantic than under recent historical climate conditions. Finally, we propose a bilinear model that accounts for the two former processes to give a conceptual explanation of the decreasing AMOC sensitivity to freshwater input. Within the limits of this bilinear model, we find that 62 ± 8% of the reduction in sensitivity is related to the changes in gyre asymmetry and freshwater leakage and 38 ± 8% is due to the reduction in deep ocean ventilation associated with the increased stratification in the North Atlantic.

Relevance: 30.00%

Publisher:

Abstract:

Snow avalanches pose a threat to settlements and infrastructure in alpine environments. Due to the catastrophic events of recent years, the public is more aware of this phenomenon. Alpine settlements have always been confronted with natural hazards, but changes in land use and in dealing with avalanche hazards lead to an altered perception of this threat. In this study, a multi-temporal risk assessment is presented for three avalanche tracks in the municipality of Galtür, Austria. Changes in avalanche risk as well as changes in the risk-influencing factors (process behaviour, values at risk (buildings), and vulnerability) between 1950 and 2000 are quantified. An additional focus is put on the interconnection between these factors and their influence on the resulting risk. The avalanche processes were calculated using different simulation models (SAMOS as well as ELBA+). For each avalanche track, different scenarios were calculated according to the development of mitigation measures. Because the focus of the study was on multi-temporal risk assessment, the models used could be replaced with other snow avalanche models providing the same functionality. The monetary values of buildings were estimated using the volume of the buildings and average prices per cubic meter. The changing size of the buildings over time was inferred from construction plans. The vulnerability of a building is understood as the degree of loss to a given element within the area affected by natural hazards. A vulnerability function for different construction types of buildings, depending on avalanche pressure, was used to assess the degree of loss. No general risk trend could be determined for the studied avalanche tracks. Due to the high complexity of the variations in risk, small changes in one of several influencing factors can cause considerable differences in the resulting risk. This multi-temporal approach leads to a better understanding of today's risk by identifying the main changes and the underlying processes. Furthermore, this knowledge can be implemented in strategies for sustainable development in Alpine settlements.
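The risk formulation used here (expected loss as the product of event probability, monetary value at risk, and vulnerability as a degree of loss) can be written down directly. A toy sketch with hypothetical building values and loss degrees (all numbers are illustrative, not Galtür data):

```python
def building_risk(value, vulnerability, event_probability):
    """Expected annual loss for one building:
    risk = p(event) * degree of loss * monetary value."""
    if not 0.0 <= vulnerability <= 1.0:
        raise ValueError("vulnerability is a degree of loss in [0, 1]")
    return event_probability * vulnerability * value

# Two hypothetical buildings on one track, for a 1/100-yr design event.
# Vulnerabilities would come from the pressure-dependent vulnerability function.
track_risk = sum(
    building_risk(value, degree_of_loss, 0.01)
    for value, degree_of_loss in [(500_000.0, 0.4), (350_000.0, 0.1)]
)
# 0.01 * 0.4 * 500000 + 0.01 * 0.1 * 350000 = 2350.0 per year
```

Re-evaluating this sum with the 1950 versus 2000 building stock, values, and vulnerability functions is exactly what makes the assessment multi-temporal.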

Relevance: 30.00%

Publisher:

Abstract:

We used multiple sets of simulations, at both the atomistic and coarse-grained levels of resolution, to investigate the interaction and binding of α-tocopherol transfer protein (α-TTP) with phosphatidylinositol phosphate lipids (PIPs). Our calculations indicate that enrichment of membranes with such lipids facilitates membrane anchoring. Atomistic models suggest that PIP can be incorporated into the binding cavity of α-TTP and therefore confirm that the protein can work as a lipid exchanger between the endosome and the plasma membrane. Comparison of the atomistic models of the α-TTP-PIPs complex with membrane-bound α-TTP revealed different roles for the various basic residues composing the basic patch that is key for the protein/ligand interaction. Such residues are of critical importance, as several point mutations at their positions lead to severe forms of ataxia with vitamin E deficiency (AVED) phenotypes. Specifically, R221 is the main residue responsible for the stabilization of the complex. R68 and R192 form strong interactions only in the protein complex or only in the membrane complex, suggesting that the two residues alternate contact formation, thus facilitating lipid flipping from the membrane into the protein cavity during the lipid exchange process. Finally, R59 shows weaker interactions with PIPs but a clear preference for specific phosphorylation positions, hinting at a role in early membrane selectivity for the protein. Altogether, our simulations reveal significant atomistic-scale aspects of the interactions of α-TTP with the plasma membrane and with PIP, clarifying the mechanism of intracellular vitamin E trafficking and helping to establish the roles of key residues for the functionality of α-TTP.

Relevance: 30.00%

Publisher:

Abstract:

This study compares gridded European seasonal series of surface air temperature (SAT) and precipitation (PRE) reconstructions with a regional climate simulation over the period 1500–1990. The area is analysed separately for nine subareas that represent the majority of the climate diversity in the European sector. In their spatial structure, an overall good agreement is found between the reconstructed and simulated climate features across Europe, supporting consistency in both products. Systematic biases between the two data sets can be explained by a priori known deficiencies in the simulation. Simulations and reconstructions, however, largely differ in the temporal evolution of past climate for European subregions. In particular, the simulated anomalies during the Maunder and Dalton minima show a stronger response to changes in the external forcings than is recorded in the reconstructions. Although this disagreement is to some extent expected given the prominent role of internal variability in the evolution of regional temperature and precipitation, a certain degree of agreement is a priori expected in variables directly affected by external forcings. In this sense, the inability of the model to reproduce a warm period similar to that recorded for the winters during the first decades of the 18th century in the reconstructions is indicative of fundamental limitations in the simulation that preclude reproducing exceptionally anomalous conditions. Despite these limitations, the simulated climate is a physically consistent data set, which can be used as a benchmark to analyse the consistency and limitations of gridded reconstructions of different variables. A comparison of the leading modes of SAT and PRE variability indicates that the reconstructions are too simplistic, especially for precipitation, which is associated with the linear statistical techniques used to generate the reconstructions.
The analysis of the co-variability between sea level pressure (SLP) and SAT and PRE in the simulation yields a result which resembles the canonical co-variability recorded in the observations for the 20th century. However, the same analysis for reconstructions exhibits anomalously low correlations, which points towards a lack of dynamical consistency between independent reconstructions.

Relevance: 30.00%

Publisher:

Abstract:

Point Distribution Models (PDMs) are among the most popular shape description techniques, and their usefulness has been demonstrated in a wide variety of medical imaging applications. However, to adequately characterize the underlying modeled population it is essential to have a representative number of training samples, which is not always possible. This problem is especially relevant as the complexity of the modeled structure increases, with the modeling of ensembles of multiple 3D organs being one of the most challenging cases. In this paper, we introduce a new GEneralized Multi-resolution PDM (GEM-PDM) in the context of multi-organ analysis, able to efficiently characterize the different inter-object relations as well as the particular locality of each object separately. Importantly, unlike previous approaches, the configuration of the algorithm is automated thanks to a new agglomerative landmark clustering method proposed here, which also allows us to identify smaller anatomically significant regions within organs. The significant advantage of the GEM-PDM method over two previous approaches (PDM and hierarchical PDM), in terms of shape modeling accuracy and robustness to noise, has been successfully verified on two different databases of sets of multiple organs: six subcortical brain structures and seven abdominal organs. Finally, we propose the integration of the new shape modeling framework into an active-shape-model-based segmentation algorithm. The resulting algorithm, named GEMA, provides better overall performance than the two classical approaches tested, ASM and hierarchical ASM, when applied to the segmentation of 3D brain MRI.
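The classical PDM that GEM-PDM generalizes is, at its core, PCA over aligned landmark coordinates: a mean shape plus a small set of variation modes. A minimal sketch on a toy 2D population (the circle data and function names are illustrative, not the paper's organ databases):

```python
import numpy as np

def fit_pdm(shapes):
    """Fit a Point Distribution Model: mean shape plus principal modes.

    shapes: (n_samples, 2 * n_landmarks) array of aligned landmark coords.
    Returns mean, modes (rows of vt), and per-mode variances.
    """
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    _, s, vt = np.linalg.svd(centered, full_matrices=False)  # PCA via SVD
    var = s ** 2 / (len(shapes) - 1)
    return mean, vt, var

def reconstruct(mean, modes, b):
    """Generate a shape from mode weights b (one weight per retained mode)."""
    return mean + b @ modes[: len(b)]

# Toy population: a circle of 20 landmarks whose radius varies across samples
rng = np.random.default_rng(4)
angles = np.linspace(0, 2 * np.pi, 20, endpoint=False)
shapes = np.array([
    np.ravel(np.column_stack([r * np.cos(angles), r * np.sin(angles)]))
    for r in 1.0 + 0.1 * rng.standard_normal(100)
])

mean, modes, var = fit_pdm(shapes)
# the single underlying degree of freedom (radius) dominates the variance
assert var[0] / var.sum() > 0.99
shape_1sd = reconstruct(mean, modes, np.array([np.sqrt(var[0])]))
```

GEM-PDM's contribution is in how landmarks are clustered into multi-resolution, inter-object groups before this kind of statistical modeling is applied; the PCA core stays the same.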

Relevance: 30.00%

Publisher:

Abstract:

We present studies of 9 modern (up to 400-yr-old) peat sections from Slovenia, Switzerland, Austria, Italy, and Finland. Precise radiocarbon dating of modern samples is possible due to the large bomb peak of atmospheric 14C concentration in 1963 and the following rapid decline in the 14C level. All the analyzed 14C profiles appeared concordant with the shape of the bomb peak of atmospheric 14C concentration, integrated over some time interval with a length specific to the peat section. In the peat layers covered by the bomb peak, calendar ages of individual peat samples could be determined almost immediately, with an accuracy of 2–3 yr. In the pre-bomb sections, the calendar ages of individual dated samples are determined in the form of multi-modal probability distributions about 300 yr wide (about AD 1650–1950). However, simultaneous use of the post-bomb and pre-bomb 14C dates, together with lithological information, enabled the rejection of most modes of the probability distributions in the pre-bomb sections. In effect, precise age-depth models of the post-bomb sections have been extended back in time, into the wiggly part of the 14C calibration curve.

Relevance: 30.00%

Publisher:

Abstract:

We present a novel surrogate-model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as a global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10² to 10⁴. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
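The standard expected improvement criterion employed within each local model has a closed form given the GP posterior mean and standard deviation at a candidate point. A self-contained sketch for minimization (the formula only; SpLEGO's sparse/local-model machinery is not reproduced here):

```python
from math import erf, exp, pi, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_improvement(mu, sigma, f_best):
    """Standard EI for minimization: EI = (f_best - mu) * Phi(z) + sigma * phi(z),
    with z = (f_best - mu) / sigma, where mu and sigma are the GP posterior
    mean and standard deviation at a candidate point and f_best the incumbent."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

# EI rewards both a low predicted mean (exploitation)...
assert expected_improvement(0.5, 0.1, 1.0) > expected_improvement(0.9, 0.1, 1.0)
# ...and high predictive uncertainty (exploration)
assert expected_improvement(1.0, 0.5, 1.0) > expected_improvement(1.0, 0.1, 1.0)
```

Maximizing this quantity over candidate points inside a selected local GP model is what decides where the next function evaluation is placed.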