982 results for robust atomic distributed amorphous


Relevance:

20.00%

Publisher:

Abstract:

Global hydrological models (GHMs) simulate the land-surface hydrological dynamics of continental-scale river basins. Here we describe one such GHM, the Macro-scale Probability-Distributed Moisture model (Mac-PDM.09). The model has undergone a number of revisions since it was last applied in the hydrological literature, and this paper provides a detailed description of the latest version. The main revisions are: (1) the ability to run the model for n repetitions, which provides more robust estimates of extreme hydrological behaviour, (2) the ability to use a gridded field of the coefficient of variation (CV) of daily rainfall for the stochastic disaggregation of monthly precipitation to daily precipitation, and (3) the ability to force the model with daily as well as monthly input climate data. We demonstrate the effect that each of these three revisions has on simulated runoff relative to the model before the revisions were applied. Importantly, we show that when Mac-PDM.09 is forced with monthly input data, it produces a negative runoff bias relative to daily forcings for regions of the globe where the day-to-day variability in relative humidity is high. The runoff bias can reach −80% for a small selection of catchments, although the absolute magnitude of the bias may be small. As such, we recommend that future applications of Mac-PDM.09 using monthly climate forcings acknowledge the bias as a limitation of the model. The performance of Mac-PDM.09 is evaluated by validating simulated runoff against observed runoff for 50 catchments. We also present a sensitivity analysis demonstrating that simulated runoff is considerably more sensitive to the method of PE calculation than to perturbations in the soil moisture and field capacity parameters.
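
The stochastic disaggregation step is only named here, not specified; as a minimal sketch of one plausible implementation (function and parameter names are illustrative assumptions, not Mac-PDM.09's actual scheme), daily totals can be drawn from a gamma distribution whose mean and coefficient of variation match the grid cell, then rescaled to conserve the monthly total. Repeating the sampling with different seeds corresponds to the n-repetition revision described above.

```python
import numpy as np

def disaggregate_month(monthly_total_mm, n_days, cv_daily, rng=None):
    """Split a monthly precipitation total into daily values.

    Daily amounts are sampled from a gamma distribution whose coefficient
    of variation equals cv_daily, then rescaled so that the daily values
    sum exactly to the monthly total (mass conservation).
    """
    rng = np.random.default_rng() if rng is None else rng
    mean_daily = monthly_total_mm / n_days
    if monthly_total_mm == 0 or cv_daily == 0:
        return np.full(n_days, mean_daily)
    shape = 1.0 / cv_daily**2          # for a gamma distribution, CV = 1/sqrt(shape)
    scale = mean_daily / shape
    draws = rng.gamma(shape, scale, size=n_days)
    return draws * (monthly_total_mm / draws.sum())

# Example: 120 mm falling in a 30-day month with a daily-rainfall CV of 1.5
daily = disaggregate_month(120.0, 30, 1.5)
print(round(daily.sum(), 6))           # 120.0
```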

Relevance:

20.00%

Publisher:

Abstract:

Recently, two approaches have been introduced that distribute the molecular fragment mining problem. The first applies a master/worker topology; the second, a completely distributed peer-to-peer system, removes the scalability bottleneck at the master node. However, in many real-world scenarios the participating computing nodes cannot communicate directly because of administrative policies such as security restrictions, so potential computing power remains inaccessible for accelerating the mining run. To address this shortcoming, this work introduces a hierarchical topology of computing resources, which distributes the management over several levels and adapts to the natural structure of such multi-domain architectures. The most important aspect is the load balancing scheme, which has been designed and optimized for the hierarchical structure. The approach allows dynamic aggregation of heterogeneous computing resources and is applied to wide-area network scenarios.
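
A hierarchical load-balancing scheme of this kind can be illustrated compactly; in the sketch below (the class name and the dispatch policy are assumptions for illustration, not the authors' algorithm), each task is routed down the tree towards the child domain that currently reports the most idle workers.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class DomainManager:
    """One node in a hierarchy of computing domains.

    Leaves hold worker slots; inner nodes forward work to the child
    subtree that currently reports the most idle capacity.
    """
    name: str
    idle_workers: int = 0                       # only meaningful on leaves
    children: list[DomainManager] = field(default_factory=list)

    def capacity(self) -> int:
        if not self.children:
            return self.idle_workers
        return sum(c.capacity() for c in self.children)

    def dispatch(self, task) -> str:
        """Route a task down the hierarchy; return the leaf that received it."""
        if not self.children:
            self.idle_workers -= 1
            return self.name
        best = max(self.children, key=lambda c: c.capacity())
        return best.dispatch(task)

root = DomainManager("root", children=[
    DomainManager("lab-A", idle_workers=3),
    DomainManager("cluster-B", children=[
        DomainManager("rack-1", idle_workers=5),
        DomainManager("rack-2", idle_workers=1),
    ]),
])
print([root.dispatch(t) for t in range(4)])   # ['rack-1', 'rack-1', 'rack-1', 'lab-A']
```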

Relevance:

20.00%

Publisher:

Abstract:

Ant colonies in nature provide a good model for a distributed, robust and adaptive routing algorithm. This paper proposes adopting the same strategy for routing packets in an Active Network. Traditional store-and-forward routers are replaced by active intermediate systems that can perform computations on transient packets, a capability that proves very helpful for developing and dynamically deploying new protocols. The adoption of the Active Networks paradigm, combined with a cooperative learning environment, produces a robust, decentralized routing algorithm capable of adapting to network traffic conditions.
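
The abstract does not reproduce the routing rule itself; a minimal sketch in the spirit of ant-based routing (the constants and method names are illustrative) keeps, at each node, a pheromone table of next-hop probabilities per destination, which forward ants sample and backward ants reinforce in proportion to the quality of the trip they measured.

```python
import random

class AntRouter:
    """Per-node pheromone table: destination -> {next_hop: probability}."""

    def __init__(self, neighbours, destinations):
        p = 1.0 / len(neighbours)
        self.table = {d: {n: p for n in neighbours} for d in destinations}

    def choose_next_hop(self, destination):
        """Forward ants (and data packets) pick a hop stochastically."""
        hops, probs = zip(*self.table[destination].items())
        return random.choices(hops, weights=probs)[0]

    def reinforce(self, destination, next_hop, trip_quality, lr=0.3):
        """Backward ant: reward the hop it arrived through, in proportion to
        trip_quality in (0, 1]; renormalise so probabilities still sum to 1."""
        row = self.table[destination]
        row[next_hop] += lr * trip_quality * (1.0 - row[next_hop])
        total = sum(row.values())
        for hop in row:
            row[hop] /= total

r = AntRouter(neighbours=["B", "C"], destinations=["Z"])
r.reinforce("Z", "B", trip_quality=0.9)
print(r.table["Z"])            # probability mass shifts towards B
print(r.choose_next_hop("Z"))  # sampled next hop, now biased towards B
```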

Relevance:

20.00%

Publisher:

Abstract:

This paper focuses on improving computer network management through the adoption of artificial intelligence techniques. A logical inference system has been devised to enable automated isolation, diagnosis, and even repair of network problems, thus enhancing the reliability, performance, and security of networks. We propose a distributed multi-agent architecture for network management, in which a logical reasoner acts as an external managing entity capable of directing, coordinating, and stimulating actions in an active management architecture. Active networks technology provides the lower-level layer that makes it possible to deploy code implementing teleo-reactive agents distributed across the whole network. We adopt the Situation Calculus to define a network model and the Reactive Golog language to implement the logical reasoner. An active network management architecture is used by the reasoner to inject and execute operational tasks in the network. The integrated system combines the advantages of logical reasoning and network programmability, providing a powerful system capable of performing high-level management tasks to deal with network faults.
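
As a rough illustration of the teleo-reactive style such agents follow (this is not the paper's Reactive Golog code; the rule names and conditions are invented), an agent repeatedly evaluates an ordered list of condition-action rules against the latest observed network state and fires the first rule whose condition holds.

```python
# A tiny teleo-reactive loop: ordered (condition, action) pairs evaluated
# against the latest observed state; the first true condition fires.
RULES = [
    (lambda s: s["link_up"] and s["loss"] < 0.01, lambda s: "nothing to do"),
    (lambda s: not s["link_up"],                  lambda s: "reroute traffic"),
    (lambda s: s["loss"] >= 0.01,                 lambda s: "probe and isolate fault"),
]

def teleo_reactive_step(state):
    for condition, action in RULES:
        if condition(state):
            return action(state)
    return "no applicable rule"

print(teleo_reactive_step({"link_up": True, "loss": 0.05}))   # probe and isolate fault
print(teleo_reactive_step({"link_up": False, "loss": 0.0}))   # reroute traffic
```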

Relevance:

20.00%

Publisher:

Abstract:

In real-world applications, sequential data mining and data exploration algorithms are often unsuitable for datasets of enormous size, high dimensionality and complex structure. Grid computing promises unprecedented opportunities for unlimited computing and storage resources. In this context there is a need to develop high-performance distributed data mining algorithms. However, the computational complexity of the problem and the large amount of data to be explored often make the design of large-scale applications particularly challenging. In this paper we present the first distributed formulation of a frequent subgraph mining algorithm for discriminative fragments of molecular compounds. Two distributed approaches have been developed and compared on the well-known National Cancer Institute's HIV-screening dataset. We present experimental results on a small-scale computing environment.
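
The distribution strategy is not detailed in this abstract; one common way to parallelise such a search (a sketch under that assumption, not the authors' formulation) is to enumerate first-level fragment candidates centrally and farm the support counting for each candidate out to worker processes, each comparing support in active versus inactive compounds.

```python
from multiprocessing import Pool

# Toy molecule set: each molecule is reduced to the set of bond types it
# contains, plus a label saying whether it was active in the screen.
MOLECULES = [
    ({"C-C", "C-N", "C-O"}, "active"),
    ({"C-C", "C-O"},        "active"),
    ({"C-N"},               "inactive"),
    ({"C-C", "C-N"},        "inactive"),
]

def score_fragment(fragment):
    """Support of one candidate fragment in actives vs. inactives."""
    act = sum(1 for bonds, lbl in MOLECULES if lbl == "active" and fragment <= bonds)
    inact = sum(1 for bonds, lbl in MOLECULES if lbl == "inactive" and fragment <= bonds)
    return fragment, act, inact

if __name__ == "__main__":
    # The coordinator enumerates candidates; each worker process evaluates a
    # disjoint part of the search space independently.
    candidates = [{"C-C"}, {"C-N"}, {"C-O"}, {"C-C", "C-O"}]
    with Pool(processes=2) as pool:
        for frag, act, inact in pool.map(score_fragment, candidates):
            print(sorted(frag), "active support:", act, "inactive support:", inact)
```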

Relevance:

20.00%

Publisher:

Abstract:

We propose a novel method for scoring the accuracy of protein binding site predictions, the Binding-site Distance Test (BDT) score. Recently, the Matthews Correlation Coefficient (MCC) has been used to evaluate binding site predictions, both by developers of new methods and by the assessors for the community-wide prediction experiment CASP8. Whilst being a rigorous scoring method, the MCC does not take into account the actual 3D location of the predicted residues relative to the observed binding site. Thus, an incorrectly predicted site that is nevertheless close to the observed binding site will obtain an identical score to the same number of non-binding residues predicted at random. The MCC is also somewhat affected by the subjectivity of determining observed binding residues and the ambiguity of choosing distance cutoffs. By contrast, the BDT method produces continuous scores ranging between 0 and 1, related to the distance between the predicted and observed residues. Residues predicted close to the binding site score higher than those more distant, providing a better reflection of the true accuracy of predictions. The CASP8 function predictions were evaluated using both the MCC and BDT methods and the scores were compared. The BDT scores were found to correlate strongly with the MCC scores whilst also being less susceptible to the subjectivity of defining binding residues. We therefore suggest that this new, simple score is a potentially more robust method for future evaluations of protein-ligand binding site predictions.
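
The exact BDT formula is not given in this abstract; purely as an illustration of the underlying idea (the exponential weighting and the d0 parameter below are assumptions, not the published definition), each predicted residue can be scored by its distance to the nearest observed binding residue and the per-residue scores averaged to give a value in [0, 1].

```python
import math

def distance_based_score(predicted, observed, d0=4.0):
    """Illustrative distance-based binding-site score in [0, 1].

    predicted, observed: lists of (x, y, z) residue coordinates.
    Each predicted residue contributes exp(-d/d0), where d is its distance
    to the nearest observed binding residue, so near-misses still earn
    partial credit (unlike a binary MCC-style count).
    """
    if not predicted or not observed:
        return 0.0
    per_residue = [
        math.exp(-min(math.dist(p, o) for o in observed) / d0)
        for p in predicted
    ]
    return sum(per_residue) / len(per_residue)

observed = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(distance_based_score([(0.5, 0.0, 0.0)], observed))   # close miss: ~0.88
print(distance_based_score([(20.0, 0.0, 0.0)], observed))  # far away:  ~0.01
```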

Relevance:

20.00%

Publisher:

Abstract:

Estimation of population size with a missing zero-class is an important problem encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by maximum likelihood and estimating the population size from this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) proposed a robust estimator for unclustered data that works well in a wide class of distributions applicable to count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and then using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size is underestimated. In search of a more robust estimator, we focused on three models that use, respectively, the clusters with exactly one case, those with exactly two cases, and those with exactly three cases to estimate the probability of the zero-class, and thereby use data collected on all the clusters in the Horvitz-Thompson estimator of population size. The loss in efficiency associated with the gain in robustness was examined in a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero-class was found to be preferable in general. In applications, we recommend obtaining estimates from all three models and making a choice that considers the estimates, the robustness and the loss in efficiency. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
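
The baseline estimator described here, a zero-truncated homogeneous Poisson fit followed by a Horvitz-Thompson correction, reduces to a few lines; the sketch below uses invented cluster counts and does not reproduce the robust cluster-based variants proposed in the paper.

```python
import math

def fit_zero_truncated_poisson(counts, tol=1e-10):
    """MLE of lambda for a zero-truncated Poisson, found by bisection.

    For the truncated distribution, E[X] = lambda / (1 - exp(-lambda)),
    so the MLE solves that equation with E[X] replaced by the sample mean.
    """
    xbar = sum(counts) / len(counts)
    lo, hi = 1e-9, xbar            # the root always lies below the sample mean
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid / (1 - math.exp(-mid)) < xbar:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def horvitz_thompson_size(counts):
    """Estimate total population size, including the unobserved zero-class."""
    lam = fit_zero_truncated_poisson(counts)
    p0 = math.exp(-lam)            # estimated probability of observing zero cases
    return len(counts) / (1 - p0)

# Illustrative cluster counts (only clusters with at least one case are observed).
observed_counts = [1, 1, 2, 1, 3, 1, 2, 1, 1, 4]
print(round(horvitz_thompson_size(observed_counts), 1))
```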

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we list some new orthogonal main-effects plans for three-level designs for 4, 5 and 6 factors in 18 runs and compare them with designs obtained from the existing L18 orthogonal array. We show that these new designs have better projection properties and can provide better parameter estimates for a range of possible models. Additionally, we study designs with smaller run sizes for cases where there are insufficient resources to perform an 18-run experiment. Plans for three-level designs for 4, 5 and 6 factors in 13 to 17 runs are given. We show that the best designs here are efficient and deserve strong consideration in many practical situations.
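
The orthogonality that underpins such main-effects plans is easy to check numerically; the sketch below (the 9-run array is a standard textbook example, not one of the new plans) codes each three-level factor with linear and quadratic contrasts and reports the determinant of their correlation matrix, which equals 1 for a fully orthogonal plan and falls below 1 as effect estimates become correlated.

```python
import numpy as np

def linear_quadratic_coding(levels):
    """Orthogonal polynomial contrasts for a three-level factor coded -1, 0, +1."""
    x = np.asarray(levels, dtype=float)
    linear = x
    quadratic = 3 * x**2 - 2            # takes values (1, -2, 1), orthogonal to linear
    return np.column_stack([linear, quadratic])

def orthogonality_measure(design):
    """Determinant of the correlation matrix of all main-effect contrast columns."""
    cols = np.hstack([linear_quadratic_coding(design[:, j])
                      for j in range(design.shape[1])])
    corr = np.corrcoef(cols, rowvar=False)
    return float(np.linalg.det(corr))

# A 9-run, 3-factor, three-level orthogonal array (levels -1, 0, 1); illustrative only.
design = np.array([
    [-1, -1, -1], [-1, 0, 0], [-1, 1, 1],
    [ 0, -1, 0], [ 0, 0, 1], [ 0, 1, -1],
    [ 1, -1, 1], [ 1, 0, -1], [ 1, 1, 0],
])
print(orthogonality_measure(design))    # ~1.0 for this orthogonal array
```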

Relevance:

20.00%

Publisher:

Abstract:

A model for the structure of amorphous molybdenum trisulfide, a-MoS3, has been created using reverse Monte Carlo methods. The model, which consists of chains of MoS6 units each sharing three sulfurs with each of its two neighbors and forming alternating long (nonbonded) and short (bonded) Mo-Mo separations, is a good fit to the neutron diffraction data and is chemically and physically realistic. The paper identifies the limitations of previous models based on Mo3 triangular clusters in accounting for the available experimental data.
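
The reverse Monte Carlo procedure itself is simple to sketch (the toy one-dimensional example below is for illustration only and is unrelated to the actual a-MoS3 configuration or diffraction data): atoms are moved at random, and a move is kept if it reduces the chi-squared misfit to the target data, or otherwise accepted with a Boltzmann-like probability.

```python
import math
import random

def chi_squared(model_curve, target_curve, sigma=0.05):
    return sum((m - t) ** 2 for m, t in zip(model_curve, target_curve)) / sigma**2

def rmc_step(positions, compute_curve, target_curve, max_move=0.1):
    """One reverse Monte Carlo move on a 1-D toy configuration.

    Displace one randomly chosen atom; keep the move if the simulated curve
    fits the target better, otherwise accept it with probability
    exp(-delta_chi2 / 2), as in standard RMC.
    """
    old_chi2 = chi_squared(compute_curve(positions), target_curve)
    i = random.randrange(len(positions))
    trial = positions[:]
    trial[i] += random.uniform(-max_move, max_move)
    new_chi2 = chi_squared(compute_curve(trial), target_curve)
    if new_chi2 <= old_chi2 or random.random() < math.exp(-(new_chi2 - old_chi2) / 2):
        return trial, new_chi2
    return positions, old_chi2

# Toy "experiment": the target curve is the sorted nearest-neighbour spacing.
def compute_curve(positions):
    p = sorted(positions)
    return [b - a for a, b in zip(p, p[1:])]

target = [1.0, 1.0, 1.0]                 # equally spaced atoms
config = [0.0, 0.7, 2.4, 3.1]
for _ in range(5000):
    config, chi2 = rmc_step(config, compute_curve, target)
print(round(chi2, 3))                    # misfit should approach 0
```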

Relevance:

20.00%

Publisher:

Abstract:

Ellipsometry and atomic force microscopy (AFM) were used to study the film thickness and surface roughness of both 'soft' and solid thin films. 'Soft' polymer thin films of polystyrene and poly(styrene-ethylene/butylene-styrene) block copolymer were prepared by spin-coating onto planar silicon wafers. Ellipsometric parameters were fitted by the Cauchy approach using a two-layer model with planar boundaries between the layers. The smoothness of the prepared polymer film surfaces was confirmed by AFM, and there is good agreement between AFM and ellipsometry in the 80-130 nm thickness range. Semiconductor (Si) surfaces obtained by anisotropic chemical etching were investigated as an example of a randomly rough surface. To determine roughness parameters by ellipsometry, the top rough layers were treated as thin films according to the Bruggeman effective medium approximation (BEMA). Surface roughness values measured by AFM and ellipsometry show the same tendency of increasing roughness with increasing etching time, although the AFM results depend on the window size used. The combined use of both methods appears to offer the most comprehensive route to quantitative surface roughness characterisation of solid films. Copyright (c) 2007 John Wiley & Sons, Ltd.
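
Both the Cauchy layer model and the Bruggeman mixing rule reduce to short formulas; the sketch below (the Cauchy coefficients and the optical constants are placeholders, not the fitted values from this study) evaluates the Cauchy refractive index and solves the two-phase BEMA equation for a rough surface layer modelled as a mixture of silicon and void.

```python
def cauchy_index(wavelength_nm, A=1.58, B=8000.0, C=0.0):
    """Cauchy dispersion n(lambda) = A + B/lambda^2 + C/lambda^4 (lambda in nm)."""
    return A + B / wavelength_nm**2 + C / wavelength_nm**4

def bruggeman_epsilon(eps1, eps2, f1, tol=1e-12):
    """Effective dielectric constant of a two-phase Bruggeman (BEMA) mixture.

    Solves f1*(eps1-e)/(eps1+2e) + (1-f1)*(eps2-e)/(eps2+2e) = 0 by bisection,
    valid here for real, positive dielectric constants.
    """
    def g(e):
        return (f1 * (eps1 - e) / (eps1 + 2 * e)
                + (1 - f1) * (eps2 - e) / (eps2 + 2 * e))
    lo, hi = min(eps1, eps2), max(eps1, eps2)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(mid) > 0:          # g decreases with e, so the root lies above mid
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Rough silicon surface modelled as 50% silicon, 50% void at 633 nm.
n_si = 3.88                      # illustrative real part of the Si index near 633 nm
eps_eff = bruggeman_epsilon(n_si**2, 1.0, 0.5)
print(round(cauchy_index(633.0), 3), round(eps_eff**0.5, 3))
```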

Relevance:

20.00%

Publisher:

Abstract:

The solid-state morphology of a series of triblock copolymers comprising a poly(ethylene glycol) (PEG) midblock and symmetric poly(gamma-benzyl-L-glutamate) (PBLG) end blocks has been studied using X-ray scattering and microscopy techniques. Transmission electron microscopy (TEM) on samples selectively stained with uranyl acetate provided clear assignment of morphologies for as-cast and annealed samples. The thickness of both the PEG and PBLG domains was in good agreement with calculations based on the conformations of the respective chains, allowing for the crystalline or amorphous state of the PEG and the α-helical or β-sheet structure of the PBLG. Atomic force microscopy provided complementary information on surface morphology for several samples, in good agreement with the structure observed by TEM. A morphology diagram was constructed. Cylindrical structures were observed for ordered samples with low f(PBLG), whereas at higher f(PBLG) there was evidence for broken lamellar and "hockey puck" nanostructures. Regular lamellae were observed for intermediate compositions.
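
The chain-conformation calculation mentioned above amounts to a simple contour-length estimate; a brief sketch follows (the degree of polymerisation and the per-residue rise values are generic textbook figures, not parameters of these samples).

```python
# Rough estimate of a PBLG end-block length from its chain conformation.
RISE_PER_RESIDUE_NM = {
    "alpha_helix": 0.15,   # axial rise per residue of an alpha-helix
    "beta_sheet": 0.34,    # approximate rise per residue of an extended strand
}

def pblg_block_length(n_residues, conformation="alpha_helix"):
    """Contour length of a PBLG block, in nm, for the given conformation."""
    return n_residues * RISE_PER_RESIDUE_NM[conformation]

print(pblg_block_length(40))                  # 40-residue alpha-helix: ~6 nm
print(pblg_block_length(40, "beta_sheet"))    # same block extended: ~13.6 nm
```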