833 results for Multi-model inference


Relevance:

30.00%

Publisher:

Abstract:

Motivation: Modelling the 3D structures of proteins can often be enhanced if more than one fold template is used during the modelling process. However, in many cases, this may also result in poorer model quality for a given target or alignment method. There is a need for modelling protocols that can both consistently and significantly improve 3D models and provide an indication of when models might not benefit from the use of multiple target-template alignments. Here, we investigate the use of both global and local model quality prediction scores produced by ModFOLDclust2 to improve the selection of target-template alignments for the construction of multiple-template models. Additionally, we evaluate clustering the resulting population of multi- and single-template models for the improvement of our IntFOLD-TS tertiary structure prediction method. Results: We find that using accurate local model quality scores to guide alignment selection is the most consistent way to significantly improve models for each of the sequence-to-structure alignment methods tested. In addition, using accurate global model quality for re-ranking alignments, prior to selection, further improves the majority of multi-template modelling methods tested. Furthermore, subsequent clustering of the resulting population of multiple-template models significantly improves the quality of selected models compared with the previous version of our tertiary structure prediction method, IntFOLD-TS.
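
A minimal sketch of the selection logic described above, assuming hypothetical score fields in place of real ModFOLDclust2 output (an illustration of the idea, not the authors' pipeline): candidate target-template alignments are re-ranked by predicted global model quality, then filtered by mean local (per-residue) quality.

```python
# Hedged sketch: quality-guided selection of target-template alignments.
# 'global_score' and 'local_scores' are hypothetical stand-ins for the
# global and per-residue quality predictions a tool like ModFOLDclust2 emits.

def select_alignments(candidates, local_cutoff=0.5, top_n=5):
    """candidates: dicts with 'id', 'global_score' (0-1) and
    'local_scores' (per-residue predicted accuracy, 0-1)."""
    # Re-rank by predicted global model quality, best first.
    ranked = sorted(candidates, key=lambda c: c["global_score"], reverse=True)
    selected = []
    for cand in ranked:
        # Keep an alignment only if its mean local quality is acceptable.
        mean_local = sum(cand["local_scores"]) / len(cand["local_scores"])
        if mean_local >= local_cutoff:
            selected.append(cand["id"])
        if len(selected) == top_n:
            break
    return selected

templates = [
    {"id": "1abcA", "global_score": 0.72, "local_scores": [0.8, 0.7, 0.6]},
    {"id": "2xyzB", "global_score": 0.55, "local_scores": [0.4, 0.3, 0.5]},
    {"id": "3pqrC", "global_score": 0.64, "local_scores": [0.7, 0.6, 0.8]},
]
print(select_alignments(templates))  # -> ['1abcA', '3pqrC']
```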

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was, within a sensitivity analysis framework, to determine whether additional model complexity gives a better capability to model the hydrology and nitrogen dynamics of a small Mediterranean forested catchment, or whether the additional parameters cause over-fitting. Three nitrogen models of varying hydrological complexity were considered. For each model, general sensitivity analysis (GSA) and Generalized Likelihood Uncertainty Estimation (GLUE) were applied, each based on 100,000 Monte Carlo simulations. The results highlighted the most complex structure as the most appropriate, providing the best representation of the non-linear patterns observed in the flow and streamwater nitrate concentrations between 1999 and 2002. Its 5% and 95% GLUE bounds, obtained considering a multi-objective approach, provide the narrowest band for streamwater nitrogen, which suggests increased model robustness, though all models exhibit periods of both good and poor fit between simulated outcomes and observed data. The results confirm the importance of the riparian zone in controlling the short-term (daily) streamwater nitrogen dynamics in this catchment, but not the overall flux of nitrogen from the catchment. It was also shown that as the complexity of a hydrological model increases, over-parameterisation occurs; the converse is true for a water quality model, where additional process representation leads to additional acceptable model simulations. Water quality data help constrain the hydrological representation in process-based models, and the increased complexity was justifiable for modelling river-system hydrochemistry.
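
The GLUE procedure described above can be sketched in a few lines, assuming a toy one-parameter decay model in place of the nitrogen models (the likelihood measure, behavioural threshold, and prior below are illustrative choices, not the study's):

```python
# Hedged GLUE sketch: Monte Carlo sampling, an informal likelihood measure
# (Nash-Sutcliffe efficiency), a behavioural threshold, and likelihood-
# weighted 5%/95% prediction bounds.
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 10, 200)
obs = np.exp(-0.3 * t) + rng.normal(0, 0.02, t.size)  # synthetic "observations"

def weighted_quantile(values, q, w):
    idx = np.argsort(values)
    cw = np.cumsum(w[idx])
    return np.interp(q * cw[-1], cw, values[idx])

n_sim = 10_000                        # the study used 100,000 per model
ks = rng.uniform(0.01, 1.0, n_sim)    # uniform prior on the decay rate
sims = np.exp(-np.outer(ks, t))       # run the toy model for every sample

# Nash-Sutcliffe efficiency as the (informal) GLUE likelihood measure.
nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
behavioural = nse > 0.5               # retain only "behavioural" runs
w = nse[behavioural] / nse[behavioural].sum()

beh = sims[behavioural]
lower = np.array([weighted_quantile(beh[:, j], 0.05, w) for j in range(t.size)])
upper = np.array([weighted_quantile(beh[:, j], 0.95, w) for j in range(t.size)])
print(int(behavioural.sum()), "behavioural runs; mean 5-95% band width:",
      round(float(np.mean(upper - lower)), 4))
```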

Relevance:

30.00%

Publisher:

Abstract:

A detailed analysis is undertaken of the Atlantic-European climate using data from 500-year-long proxy-based climate reconstructions, a long climate simulation with perpetual 1990 forcing, as well as two global and one regional climate change scenarios. The observed and simulated interannual variability and teleconnectivity are compared and interpreted in order to improve the understanding of natural climate variability on interannual to decadal time scales for the late Holocene. The focus is on the Atlantic-European and Alpine regions during the winter and summer seasons, using temperature, precipitation, and 500 hPa geopotential height fields. The climate reconstruction shows pronounced interdecadal variations that appear to “lock” the atmospheric circulation in quasi-steady long-term patterns over multi-decadal periods, controlling at least part of the temperature and precipitation variability. Different circulation patterns are persistent over several decades for the period 1500 to 1900. The 500-year-long simulation with perpetual 1990 forcing shows some substantial differences, with more unsteady teleconnectivity behaviour. The two global scenario simulations indicate a transition towards more stable teleconnectivity over the next 100 years. Time series of reconstructed and simulated temperature and precipitation over the Alpine region show comparatively small changes in interannual variability within the time frame considered, with the exception of the summer season, where a substantial increase in interannual variability is simulated by regional climate models.

Relevance:

30.00%

Publisher:

Abstract:

The observed dramatic decrease in September sea ice extent (SIE) has been widely discussed in the scientific literature. Though there is qualitative agreement between observations and ensemble members of the Third Coupled Model Intercomparison Project (CMIP3), it is concerning that the observed trend (1979–2010) is not captured by any ensemble member. The potential sources of this discrepancy include observational uncertainty, physical model limitations and vigorous natural climate variability. The latter has received less attention and is difficult to assess using the relatively short observational sea ice records. In this study, multi-centennial pre-industrial control simulations with five CMIP3 climate models are used to investigate the role that the Arctic Oscillation (AO), the Atlantic Multi-decadal Oscillation (AMO) and the Atlantic meridional overturning circulation (AMOC) play in decadal sea ice variability. Further, we use the models to determine the impact that these sources of variability have had on SIE over both the era of satellite observation (1979–2010) and an extended observational record (1953–2010). There is little evidence of a relationship between the AO and SIE in the models. However, we find that both the AMO and AMOC indices are significantly correlated with SIE in all the models considered. Using sensitivity statistics derived from the models, and assuming a linear relationship, we attribute 0.5–3.1%/decade of the 10.1%/decade decline in September SIE (1979–2010) to AMO-driven variability.
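
The attribution step lends itself to a compact sketch, with made-up numbers standing in for model output and observed indices (the sensitivity and AMO trend below are illustrative, not the study's values): regress SIE on an AMO index in a long control run, then scale by the observed AMO change.

```python
# Hedged sketch of linear-sensitivity attribution.
import numpy as np

rng = np.random.default_rng(0)
n_years = 500                                  # multi-centennial control run
amo = rng.normal(0, 0.2, n_years)              # synthetic AMO index (degC)
sie = 7.0 - 1.5 * amo + rng.normal(0, 0.3, n_years)  # synthetic SIE (10^6 km^2)

# Sensitivity of SIE to the AMO, by least squares (linearity assumed).
sensitivity = np.polyfit(amo, sie, 1)[0]       # 10^6 km^2 per degC of AMO

amo_trend_per_decade = 0.05                    # degC/decade, illustrative
attributed = 100 * sensitivity * amo_trend_per_decade / sie.mean()
print(f"AMO-attributed SIE trend: {attributed:.1f} %/decade")  # negative = decline
```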

Relevance:

30.00%

Publisher:

Abstract:

Acrylamide is formed from reducing sugars and asparagine during the preparation of French fries. The commercial preparation of French fries is a multi-stage process involving the preparation of frozen, par-fried potato strips for distribution to catering outlets, where they are finish-fried. The initial blanching, treatment in glucose solution and par-frying steps are crucial, since they determine the levels of precursors present at the beginning of the finish-frying process. In order to minimize the quantities of acrylamide in cooked fries, it is important to understand the impact of each stage on the formation of acrylamide. Acrylamide, amino acids, sugars, moisture, fat and color were monitored at time intervals during the frying of potato strips that had been dipped in varying concentrations of glucose and fructose during a typical pretreatment. A mathematical model of the finish-frying was developed based on the fundamental chemical reaction pathways, incorporating moisture and temperature gradients in the fries. This showed the contribution of both glucose and fructose to the generation of acrylamide, and accurately predicted the acrylamide content of the final fries.
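
A much-reduced kinetic sketch of the kind of reaction-pathway model described (the full model also resolves moisture and temperature gradients within the fries; the rate constants and concentrations below are illustrative, not fitted values):

```python
# Hedged sketch: glucose and fructose each react with asparagine to form
# acrylamide, which then degrades; integrated as coupled ODEs.
import numpy as np
from scipy.integrate import solve_ivp

k_glc, k_frc, k_deg = 0.02, 0.05, 0.01     # illustrative rate constants (1/s)

def rhs(t, y):
    glc, frc, asn, acr = y                 # concentrations, arbitrary units
    r1 = k_glc * glc * asn                 # glucose + asparagine pathway
    r2 = k_frc * frc * asn                 # fructose + asparagine pathway
    return [-r1, -r2, -(r1 + r2), r1 + r2 - k_deg * acr]

y0 = [10.0, 5.0, 20.0, 0.0]                # initial precursors, no acrylamide
sol = solve_ivp(rhs, (0, 180), y0)         # a 3-minute finish fry
print(f"acrylamide after 180 s: {sol.y[3, -1]:.2f} (arbitrary units)")
```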

Relevance:

30.00%

Publisher:

Abstract:

Statistical methods of inference typically require the likelihood function to be computable in a reasonable amount of time. The class of “likelihood-free” methods termed Approximate Bayesian Computation (ABC) eliminates this requirement, replacing the evaluation of the likelihood with simulation from the model. Likelihood-free methods have gained in efficiency and popularity in the past few years, following their integration with Markov Chain Monte Carlo (MCMC) and Sequential Monte Carlo (SMC) in order to better explore the parameter space. They have been applied primarily to estimating the parameters of a given model, but can also be used to compare models. Here we present novel likelihood-free approaches to model comparison, based upon the independent estimation of the evidence of each model under study. Key advantages of these approaches over previous techniques are that they allow the exploitation of MCMC or SMC algorithms for exploring the parameter space, and that they do not require a sampler able to mix between models. We validate the proposed methods using a simple exponential family problem before applying them to a realistic problem from human population genetics: the comparison of different demographic models based upon genetic data from the Y chromosome.
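
The core idea, evidence estimated independently for each model, can be illustrated with plain rejection ABC (the paper's methods build on MCMC/SMC samplers; the two toy models, prior, and tolerance below are assumptions for the sketch): the acceptance rate of prior-predictive simulations within a tolerance of the observed summary statistic approximates each model's evidence, and the ratio of those rates gives an approximate Bayes factor.

```python
# Hedged rejection-ABC sketch of likelihood-free model comparison.
import numpy as np

rng = np.random.default_rng(1)
obs = 1.7                 # observed summary statistic (toy)
eps = 0.05                # ABC tolerance
n = 200_000               # prior-predictive simulations per model

def evidence(simulate):
    """ABC evidence estimate: prior-predictive acceptance rate."""
    theta = rng.uniform(0, 5, n)            # same prior for both toy models
    return np.mean(np.abs(simulate(theta) - obs) < eps)

# Toy model 1: summary ~ Normal(theta, 0.5); toy model 2: ~ Exponential(theta)
z1 = evidence(lambda th: rng.normal(th, 0.5))
z2 = evidence(lambda th: rng.exponential(th))
print(f"evidence M1 ~ {z1:.4f}, M2 ~ {z2:.4f}, Bayes factor ~ {z1 / z2:.2f}")
```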

Relevance:

30.00%

Publisher:

Abstract:

The increasing use of drug combinations to treat disease states, such as cancer, calls for improved delivery systems that are able to deliver multiple agents. Herein, we report a series of novel Janus dendrimers with potential for use in combination therapy. Different generations (first and second) of PEG-based dendrons containing two different “model drugs”, benzyl alcohol (BA) and 3-phenylpropionic acid (PPA), were synthesized. BA and PPA were attached via two different linkers (carbonate and ester, respectively) to promote differential drug release. The four dendrons were coupled together via [3 + 2] cycloaddition chemistries to afford four Janus dendrimers containing varying amounts and different ratios of BA and PPA, namely, (BA)2-G1-G1-(PPA)2, (BA)4-G2-G1-(PPA)2, (BA)2-G1-G2-(PPA)4, and (BA)4-G2-G2-(PPA)4. Release studies in plasma showed that the dendrimers provided sequential release of the two model drugs, with BA being released faster than PPA from all of the dendrimers. The different dendrimers allowed delivery of the two model drug compounds in increasing amounts (0.15–0.30 mM) and in exact molecular ratios of BA to PPA (1:1, 2:1, 1:2 and 1:1, respectively). The dendrimers were noncytotoxic (100% viability at 1 mg/mL) toward human umbilical vein endothelial cells (HUVEC) and nontoxic toward red blood cells, as confirmed by hemolysis studies. These studies demonstrate that these Janus PEG-based dendrimers offer great potential for the delivery of drugs via combination therapy.

Relevance:

30.00%

Publisher:

Abstract:

A very high resolution atmospheric general circulation model, T106-L19, has been used for the simulation of hurricanes in a multi-year numerical experiment. Individual storms, as well as their geographical and seasonal distribution, agree remarkably well with observations. Although only the thermal and dynamical structure of the storms was used as the criterion for their identification, practically all of them occur in areas where the sea surface temperature is higher than or equal to 26 °C. There are considerable variations from year to year in the number of storms, even though there are no interannual variations in the SST pattern. It is found that the number of storms in particular areas appears to depend on the intensity of the Hadley-Walker cell. The result is clearly resolution-dependent: at lower horizontal resolution, T42, for example, the intensity of the storms is significantly reduced and their overall structure is less realistic, including their vertical form and extent.

Relevance:

30.00%

Publisher:

Abstract:

This chapter presents techniques used for the generation of 3D digital elevation models (DEMs) from remotely sensed data. Three methods are explored and discussed: optical stereoscopic imagery, Interferometric Synthetic Aperture Radar (InSAR), and Light Detection and Ranging (LIDAR). For each approach, the state of the art presented in the literature is reviewed, the techniques involved in DEM generation are presented together with an accuracy evaluation, and results of DEMs reconstructed from remotely sensed data are illustrated. While DEM generation from satellite stereoscopic imagery represents a good example of the passive, multi-view imaging technology discussed in Chap. 2 of this book, InSAR and LIDAR use different principles to acquire 3D information; both of these technologies are therefore discussed in detail in order to convey their fundamentals.

Relevance:

30.00%

Publisher:

Abstract:

This work proposes a unified neurofuzzy modelling scheme. To begin with, the initial fuzzy base construction method is based on fuzzy clustering utilising a Gaussian mixture model (GMM), combined with analysis of variance (ANOVA) decomposition in order to obtain more compact univariate and bivariate membership functions over the subspaces of the input features. The means and covariances of the Gaussian membership functions are found by the expectation-maximisation (EM) algorithm, which has the merit of revealing the underlying density distribution of the system inputs. The resultant set of membership functions forms the basis of the generalised fuzzy model (GFM) inference engine. The model structure and parameters of this neurofuzzy model are identified via supervised subspace orthogonal least squares (OLS) learning. Finally, instead of providing a deterministic class label as the model output by convention, a logistic regression model is applied to produce the classifier’s output, in which the sigmoid-type logistic transfer function scales the outputs of the neurofuzzy model to class probabilities. Experimental validation results are presented to demonstrate the effectiveness of the proposed neurofuzzy modelling scheme.
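
Two ingredients of the scheme, EM-fitted Gaussian mixture components acting as membership functions and a logistic map from the fuzzy layer to class probabilities, can be sketched as follows (the ANOVA decomposition and OLS structure selection of the full method are omitted, and the data are synthetic):

```python
# Hedged sketch: GMM responsibilities as membership functions, with logistic
# regression yielding a class probability rather than a hard label.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# EM fits the mixture; each component's posterior responsibility serves as
# a normalised membership function over the input space.
gmm = GaussianMixture(n_components=4, random_state=0).fit(X)
memberships = gmm.predict_proba(X)       # shape (n_samples, n_components)

# Logistic regression maps the fuzzy-layer activations to P(class | x).
clf = LogisticRegression().fit(memberships, y)
print("training accuracy:", clf.score(memberships, y))
```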

Relevance:

30.00%

Publisher:

Abstract:

Analysis of single-forcing runs from CMIP5 (the fifth Coupled Model Intercomparison Project) simulations shows that the mid-twentieth-century temperature hiatus, and the coincident decrease in precipitation, is likely to have been influenced strongly by anthropogenic aerosol forcing. Models that include a representation of the indirect effect of aerosol better reproduce inter-decadal variability in historical global-mean near-surface temperatures, particularly the cooling in the 1950s and 1960s, compared to models with a representation of the aerosol direct effect only. Models with the indirect effect also show a more pronounced decrease in precipitation during this period, which is in better agreement with observations, and greater inter-decadal variability in the inter-hemispheric temperature difference. This study demonstrates the importance of representing aerosols, and their indirect effects, in general circulation models, and suggests that inter-model diversity in aerosol burden and in the representation of aerosol–cloud interaction can produce substantial variation in simulations of climate variability on multi-decadal timescales.

Relevance:

30.00%

Publisher:

Abstract:

This study puts forward a method to model and simulate the complex system of a hospital on the basis of multi-agent technology. Hospital agents with intelligent and coordinative characteristics were designed, the message object was defined, and the operating mechanism of the model's autonomous activities and its coordination mechanism were also designed. In addition, an Ontology library and a Norm library, among other components, were introduced using semiotic methods and theory to extend the approach to system modelling. Swarm was used to develop the multi-agent based simulation system, which can provide guidelines for hospitals to improve their organization and management, optimize working procedures, improve the quality of medical care, and reduce medical costs.

Relevance:

30.00%

Publisher:

Abstract:

Nowadays a changing environment is the main challenge for most organizations, since they have to evaluate appropriate policies to adapt to it. In this paper, we propose a multi-agent simulation method for evaluating policies, based on complex adaptive system theory. Furthermore, we propose a semiotic EDA (Epistemic, Deontic, Axiological) agent model to simulate agents' behavior in the system by incorporating the social norms reflecting the policy. A case study is also provided to validate our approach. Our approach presents better adaptability and validity than qualitative analysis and experimental approaches, and the semiotic agent model provides high credibility in simulating agents' behavior.
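
A speculative mini-sketch of what an EDA-style agent might look like (the names, structure, and norm format are illustrative, not the paper's specification): the epistemic component holds beliefs, the deontic component encodes norms from the policy under evaluation, and the axiological component scores the permitted actions.

```python
# Hedged sketch of an Epistemic-Deontic-Axiological agent.
from dataclasses import dataclass, field

@dataclass
class EDAAgent:
    beliefs: dict = field(default_factory=dict)  # epistemic: what the agent knows
    norms: list = field(default_factory=list)    # deontic: (action, beliefs) -> bool
    values: dict = field(default_factory=dict)   # axiological: action -> utility

    def perceive(self, observation: dict):
        self.beliefs.update(observation)         # belief revision on new input

    def permitted(self, action: str) -> bool:
        return all(norm(action, self.beliefs) for norm in self.norms)

    def act(self) -> str:
        options = [a for a in self.values if self.permitted(a)]
        return max(options, key=self.values.get) if options else "wait"

# Example: a policy norm forbids "expand" while believed demand is low.
agent = EDAAgent(
    norms=[lambda a, b: not (a == "expand" and b.get("demand") == "low")],
    values={"expand": 3.0, "maintain": 1.0},
)
agent.perceive({"demand": "low"})
print(agent.act())  # -> 'maintain'
```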

Relevance:

30.00%

Publisher:

Abstract:

The link between the Pacific/North American pattern (PNA) and the North Atlantic Oscillation (NAO) is investigated in reanalysis data (NCEP, ERA40) and in multi-century CGCM runs for present-day climate using three versions of the ECHAM model. PNA and NAO patterns and indices are determined via rotated principal component analysis on monthly mean 500 hPa geopotential height fields using the varimax criterion. On average, the multi-century CGCM simulations show a significant anti-correlation between PNA and NAO. Further, multi-decadal periods with significantly enhanced (high anti-correlation, active phase) or weakened (low correlation, inactive phase) coupling are found in all CGCMs. In the simulated active phases, the storm track activity near Newfoundland has a stronger link with the PNA variability than during the inactive phases. On average, the reanalysis datasets show no significant anti-correlation between the PNA and NAO indices, but during the sub-period 1973–1994 a significant anti-correlation is detected, suggesting that the present climate could correspond to an inactive period as detected in the CGCMs. An analysis of possible physical mechanisms suggests that the link between the patterns is established by the baroclinic waves forming the North Atlantic storm track. The geopotential height anomalies associated with negative PNA phases induce an increased advection of warm and moist air from the Gulf of Mexico and of cold air from Canada. Both types of advection contribute to increased baroclinicity over eastern North America and also increase the low-level latent heat content of the warm air masses. Thus, growth conditions for eddies at the entrance of the North Atlantic storm track are enhanced. Considering the average temporal development during winter in the CGCMs, results show an enhanced Newfoundland storm track maximum in early winter for negative PNA, followed by a downstream enhancement of the Atlantic storm track in the subsequent months. In active (inactive) phases, this seasonal development is enhanced (suppressed). As the storm track over the central and eastern Atlantic is closely related to the NAO variability, this development can be explained by the shift of the NAO index to more positive values.
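
The index construction itself is compact enough to sketch (with a random field standing in for reanalysis or CGCM geopotential heights; in practice the PNA- and NAO-like patterns would be identified among the rotated loadings by their centres of action):

```python
# Hedged sketch: rotated principal component analysis of monthly 500 hPa
# height anomalies with a standard varimax rotation, then index time series
# from projection onto the rotated patterns.
import numpy as np

def varimax(L, n_iter=100, tol=1e-6):
    """Varimax rotation of a loadings matrix (standard algorithm)."""
    p, k = L.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(n_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag(np.sum(Lr ** 2, axis=0)) / p))
        R = u @ vt
        if s.sum() < var * (1 + tol):
            break
        var = s.sum()
    return L @ R

rng = np.random.default_rng(3)
Z500 = rng.normal(size=(600, 500))         # (months, grid points), synthetic

anom = Z500 - Z500.mean(axis=0)            # remove the long-term mean
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
loadings = Vt[:10].T * s[:10]              # leading 10 EOF loadings

rot = varimax(loadings)                    # rotated spatial patterns
indices = anom @ rot / np.sum(rot ** 2, axis=0)  # monthly index time series

# Correlate two rotated-pattern indices (e.g. PNA- and NAO-like ones).
r = np.corrcoef(indices[:, 0], indices[:, 1])[0, 1]
print(f"index correlation: {r:+.2f}")
```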

Relevance:

30.00%

Publisher:

Abstract:

Bayesian inference for time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We provide demonstrations of the algorithm through the tracking of chirps and the analysis of musical data.
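
A schematic of the online idea, in the spirit of the description above but not the authors' algorithm: the state is the location of a drifting spectral peak, particles evolve by a random walk, and weights come from a local Whittle likelihood evaluated on a sliding-window periodogram under a simple parametric spectral model (the peak shape, noise floor, and step sizes are assumptions).

```python
# Hedged sketch: particle filtering with a local Whittle likelihood.
import numpy as np

rng = np.random.default_rng(5)
n = 4000
t = np.arange(n)
true_f = 0.10 + 0.15 * t / n                     # slowly drifting frequency
x = np.sin(2 * np.pi * np.cumsum(true_f)) + 0.5 * rng.normal(size=n)

win, n_part = 256, 500
freqs = np.fft.rfftfreq(win)[1:-1]               # local Fourier frequencies

def spec(theta):
    """Toy parametric spectral density: a peak at theta over a noise floor."""
    return 0.25 + 5.0 * np.exp(-((freqs[None, :] - theta[:, None]) ** 2)
                               / (2 * 0.01 ** 2))

particles = rng.uniform(0.05, 0.45, n_part)
est = []
for start in range(0, n - win, win):
    seg = x[start:start + win]
    I = np.abs(np.fft.rfft(seg))[1:-1] ** 2 / win        # local periodogram
    particles += 0.005 * rng.normal(size=n_part)         # random-walk dynamics
    S = spec(particles)
    logw = -np.sum(np.log(S) + I[None, :] / S, axis=1)   # local Whittle log-lik
    w = np.exp(logw - logw.max()); w /= w.sum()
    est.append(float(np.sum(w * particles)))             # posterior-mean track
    particles = particles[rng.choice(n_part, n_part, p=w)]  # resampling
print("tracked peak, first/last window:", round(est[0], 3), round(est[-1], 3))
```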