898 results for "Deterministic imputation"


Relevance: 10.00%

Abstract:

A dynamic, deterministic, economic simulation model was developed to estimate the costs and benefits of controlling Mycobacterium avium subsp. paratuberculosis (Johne's disease) in a suckler beef herd. The model is intended as a demonstration tool for veterinarians to use with farmers. The model design process involved user consultation and participation, and the model is freely accessible on a dedicated website. The 'user-friendly' model interface allows the input of key assumptions and farm-specific parameters, enabling model simulations to be tailored to individual farm circumstances. The model simulates the effect of Johne's disease and various measures for its control in terms of herd prevalence and the shedding states of animals within the herd, the financial costs of the disease and of any control measures, and the likely benefits of controlling Johne's disease in the beef suckler herd over a 10-year period. The model thus helps to make more transparent the 'hidden costs' of Johne's disease in a herd and the likely benefits to be gained from controlling the disease. The control strategies considered within the model are 'no control', 'testing and culling of diagnosed animals', 'improving management measures', or a dual strategy of 'testing and culling in association with improving management measures'. An example 'run' of the model shows that the strategy 'improving management measures', which reduces infection routes during the early stages, results in a marked fall in herd prevalence and total costs. Testing and culling does little to reduce prevalence and does not reduce total costs over the 10-year period.
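To make the kind of strategy comparison described above concrete, the following is a minimal sketch of a deterministic year-by-year prevalence and cost recursion under different control strategies. All function names, parameters and values are hypothetical illustrations; they are not taken from the published model or its website.

```python
# Toy deterministic recursion for herd prevalence and cumulative cost over a
# 10-year horizon; every parameter below is invented for illustration only.

def simulate(years=10, prevalence=0.10, herd_size=100,
             transmission=0.25, cull_fraction=0.0,
             loss_per_infected=300.0, control_cost_per_year=0.0):
    """Return (final prevalence, total cost) for one control strategy."""
    total_cost = 0.0
    for _ in range(years):
        prevalence = min(1.0, prevalence * (1.0 + transmission))  # new infections
        prevalence *= (1.0 - cull_fraction)                       # test-and-cull effect
        total_cost += prevalence * herd_size * loss_per_infected  # production losses
        total_cost += control_cost_per_year                       # cost of the strategy
    return prevalence, total_cost

print(simulate())                                                  # 'no control'
print(simulate(cull_fraction=0.3, control_cost_per_year=500.0))    # 'testing and culling'
print(simulate(transmission=0.10, control_cost_per_year=800.0))    # 'improved management'
```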

Relevance: 10.00%

Abstract:

Process-based integrated modelling of weather and crop yield over large areas is becoming an important research topic. The production of the DEMETER ensemble hindcasts of weather allows this work to be carried out in a probabilistic framework. In this study, ensembles of crop yield (groundnut, Arachis hypogaea L.) were produced for ten 2.5° × 2.5° grid cells in western India using the DEMETER ensembles and the general large-area model (GLAM) for annual crops. Four key issues are addressed by this study. First, crop model calibration methods for use with weather ensemble data are assessed. Calibration using yield ensembles was more successful than calibration using reanalysis data (the European Centre for Medium-Range Weather Forecasts 40-yr reanalysis, ERA40). Secondly, the potential for probabilistic forecasting of crop failure is examined. The hindcasts show skill in the prediction of crop failure, with more severe failures being more predictable. Thirdly, the use of yield ensemble means to predict interannual variability in crop yield is examined and their skill assessed relative to baseline simulations using ERA40. The accuracy of multi-model yield ensemble means is equal to or greater than the accuracy using ERA40. Fourthly, the impact of two key uncertainties, sowing window and spatial scale, is briefly examined. The impact of uncertainty in the sowing window is greater with ERA40 than with the multi-model yield ensemble mean. Subgrid heterogeneity affects model accuracy: where correlations are low on the grid scale, they may be significantly positive on the subgrid scale. The implications of the results of this study for yield forecasting on seasonal time-scales are as follows. (i) There is the potential for probabilistic forecasting of crop failure (defined by a threshold yield value); forecasting of yield terciles shows less potential. (ii) Any improvement in the skill of climate models has the potential to translate into improved deterministic yield prediction. (iii) Whilst model input uncertainties are important, uncertainty in the sowing window may not require specific modelling. The implications of the results of this study for yield forecasting on multidecadal (climate change) time-scales are as follows. (i) The skill in the ensemble mean suggests that the perturbation, within uncertainty bounds, of crop and climate parameters, could potentially average out some of the errors associated with mean yield prediction. (ii) For a given technology trend, decadal fluctuations in the yield-gap parameter used by GLAM may be relatively small, implying some predictability on those time-scales.
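As a sketch of how such a yield ensemble can be used both deterministically (via its mean) and probabilistically (for crop failure), the short fragment below computes both quantities from a single ensemble. The yield values and failure threshold are invented; they do not come from the DEMETER/GLAM hindcasts.

```python
import numpy as np

# One hypothetical ensemble of simulated yields for a single grid cell and season
yields = np.array([820., 640., 910., 300., 760., 550., 880., 410., 700.])  # kg/ha
failure_threshold = 500.0   # yields below this threshold count as crop failure

ensemble_mean = yields.mean()                    # deterministic-style prediction
p_failure = np.mean(yields < failure_threshold)  # probabilistic failure forecast

print(f"ensemble-mean yield: {ensemble_mean:.0f} kg/ha")
print(f"probability of crop failure: {p_failure:.2f}")
```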

Relevance: 10.00%

Abstract:

Despite decades of research, it remains controversial whether ecological communities converge towards a common structure determined by environmental conditions irrespective of assembly history. Here, we show experimentally that the answer depends on the level of community organization considered. In a 9-year grassland experiment, we manipulated initial plant composition on abandoned arable land and subsequently allowed natural colonization. Initial compositional variation caused plant communities to remain divergent in species identities, even though these same communities converged strongly in species traits. This contrast between species divergence and trait convergence could not be explained by dispersal limitation or community neutrality alone. Our results show that the simultaneous operation of trait-based assembly rules and species-level priority effects drives community assembly, making it both deterministic and historically contingent, but at different levels of community organization.

Relevance: 10.00%

Abstract:

Accurately and reliably identifying the actual number of clusters present within a dataset of gene expression profiles, when no additional information on cluster structure is available, is a problem addressed by few algorithms. GeneMCL transforms microarray analysis data into a graph consisting of nodes connected by edges, where the nodes represent genes and the edges represent the similarity in expression of those genes, as given by a proximity measurement. This measurement is taken to be the Pearson correlation coefficient combined with a local non-linear rescaling step. The resulting graph is input to the Markov Cluster (MCL) algorithm, an elegant, deterministic, non-specific and scalable method that models stochastic flow through the graph. The algorithm is inherently affected by any cluster structure present, and rapidly decomposes a graph into cohesive clusters. The potential of the GeneMCL algorithm is demonstrated with a 5730 gene subset (IGS) of the Van't Veer breast cancer database, for which the clusterings are shown to reflect underlying biological mechanisms.
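For orientation, the fragment below sketches the core MCL iteration (expansion, followed by inflation and renormalisation) on a hand-made adjacency matrix. It omits the gene-expression-specific steps described above (Pearson correlation and the local non-linear rescaling), so it illustrates the clustering engine rather than GeneMCL itself.

```python
import numpy as np

def mcl(adjacency, expansion=2, inflation=2.0, iterations=50):
    """Markov Cluster iteration on a column-stochastic flow matrix."""
    m = adjacency + np.eye(adjacency.shape[0])    # add self-loops
    m = m / m.sum(axis=0)                         # column-normalise
    for _ in range(iterations):
        m = np.linalg.matrix_power(m, expansion)  # expansion: flow spreads out
        m = m ** inflation                        # inflation: strong flow is boosted
        m = m / m.sum(axis=0)                     # renormalise columns
    return m  # clusters are read off from the non-zero rows of the limit matrix

# Two obvious clusters: nodes {0, 1, 2} and {3, 4}
a = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [0, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
print(np.round(mcl(a), 2))
```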

Relevance: 10.00%

Abstract:

The constructivist model of 'soft' value management (VM) is contrasted with the VM discourse appropriated by cost consultants who operate from within UK quantity surveying (QS) practices. The enactment of VM by cost consultants is shaped by the institutional context within which they operate and is not necessarily representative of VM practice per se. Opportunities to perform VM during the formative stages of design are further constrained by the positivistic rhetoric that such practitioners use to conceptualize and promote their services. The complex interplay between VM theory and practice is highlighted and analysed from a non-deterministic perspective. Codified models of 'best practice' are seen to be socially constructed and legitimized through human interaction in the context of interorganizational networks. Published methodologies are seen to inform practice in only a loose and indirect manner, with extensive scope for localized improvisation. New insights into the relationship between VM theory and practice are derived from the dramaturgical metaphor. The social reality of VM is seen to be constituted through scripts and performances, both of which are continuously contested across organizational arenas. It is concluded that VM defies universal definition and is conceptualized and enacted differently across different localized contexts.

Relevance: 10.00%

Abstract:

Current literature offers little understanding about how procurement methods are enacted in practice. Developments in procurement are often viewed as the result of responding to recommendations from particular constituents within the sector. The research seeks to remove itself from such deterministic leanings, counselling instead that procurement should not be viewed in static terms, but as dynamically manifesting over time within a complex web of interconnections between various actors, their situated context and the broader industrial structure. Attention is given to how a client and construction firm engaged in a collusive interaction to realise an innovative procurement method that derived its legitimacy from a backcloth of initiatives promoted by various commentators. A case study of a medium-sized regional contractor demonstrates how the first partnering arrangement was enacted within the UK affordable housing maintenance sector. The case study finds that the enactment of new procurement methods strongly relies on iterative learning between clients and contractors. It is further suggested that construction firms need to initiate new procurement methods in order to remain competitive within the sector. The findings point towards a pro-active initiative by the contractor and client to enact a 'procurement first'. Encouragement may be drawn from this example by other contractors seeking to offer more than simply responsive procurement solutions.

Relevance: 10.00%

Abstract:

The application of prediction theories has been widely practised for many years in industries such as manufacturing, defence and aerospace. Although these theories are not new, they have not been widely applied within the building services industry. Collectively, the industry should take a deeper look at these approaches in comparison with the traditional deterministic approaches currently being practised. By extending their application into this sector, this paper seeks to provide an overview of how simplified stochastic modelling, coupled with availability and reliability predictions using historical data compiled from various sources, could enhance the quality of building services systems.
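As one simple illustration of the kind of reliability prediction referred to here, the sketch below derives steady-state availability from hypothetical MTBF/MTTR figures and combines components in series. The component names and figures are invented; a real study would use properly compiled historical data.

```python
# Invented historical figures for three building services components (hours)
components = {
    "chiller": {"mtbf_h": 8000.0,  "mttr_h": 24.0},
    "pump":    {"mtbf_h": 20000.0, "mttr_h": 8.0},
    "ahu":     {"mtbf_h": 12000.0, "mttr_h": 12.0},
}

def availability(mtbf_h, mttr_h):
    """Steady-state availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_h / (mtbf_h + mttr_h)

system_availability = 1.0
for name, d in components.items():
    a = availability(d["mtbf_h"], d["mttr_h"])
    system_availability *= a   # series system: every component must be available
    print(f"{name}: A = {a:.4f}")

print(f"series-system availability: {system_availability:.4f}")
```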

Relevance: 10.00%

Abstract:

More than thirty years ago, Amari and colleagues proposed a statistical framework for identifying structurally stable macrostates of neural networks from observations of their microstates. We compare their stochastic stability criterion with a deterministic stability criterion based on the ergodic theory of dynamical systems, recently proposed for the scheme of contextual emergence and applied to particular inter-level relations in neuroscience. Stochastic and deterministic stability criteria for macrostates rely on macro-level contexts, which make them sensitive to differences between different macro-levels.

Relevance: 10.00%

Abstract:

A fast Knowledge-based Evolution Strategy, KES, for the multi-objective minimum spanning tree is presented. The proposed algorithm is validated, for the bi-objective case, with an exhaustive search for small problems (4-10 nodes), and compared with a deterministic algorithm, EPDA, and with NSGA-II for larger problems (up to 100 nodes) using benchmark hard instances. Experimental results show that KES finds the true Pareto fronts for small instances of the problem and calculates good approximate Pareto sets for the larger instances tested. It is shown that the fronts calculated by KES are superior to the NSGA-II fronts and almost as good as those established by EPDA. KES is designed to be scalable to multi-objective problems and fast due to its low computational complexity.
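For readers unfamiliar with the terminology, the short sketch below shows what finding the Pareto front means in the bi-objective case: non-dominated filtering of candidate spanning trees that have already been evaluated on two minimisation objectives. The candidate objective vectors are invented and the sketch is not part of KES itself.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(costs):
    return [c for c in costs
            if not any(dominates(other, c) for other in costs if other != c)]

# (total weight under criterion 1, total weight under criterion 2) per candidate tree
candidates = [(10, 42), (12, 30), (15, 28), (11, 45), (20, 25), (13, 31)]
print(pareto_front(candidates))   # -> [(10, 42), (12, 30), (15, 28), (20, 25)]
```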

Relevance: 10.00%

Abstract:

A hybridised Knowledge-based Evolutionary Algorithm (KEA) is applied to the multi-criterion minimum spanning tree problem. Hybridisation is used across its three phases. In the first phase a deterministic single-objective optimization algorithm finds the extreme points of the Pareto front. In the second phase a K-best approach finds the first neighbours of the extreme points, which serve as an elitist parent population for an evolutionary algorithm in the third phase. A knowledge-based mutation operator is applied in each generation to reproduce individuals that are at least as good as the unique parent. The advantages of KEA over previous algorithms include its speed (making it applicable to large real-world problems), its scalability to more than two criteria, and its ability to find both the supported and unsupported optimal solutions.
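The first, deterministic phase described above can be pictured as running a standard single-objective minimum spanning tree algorithm once per criterion. The sketch below does exactly that with Kruskal's algorithm on an invented two-criteria edge list; it illustrates the idea rather than the KEA implementation.

```python
def kruskal(n_nodes, edges, weight_index):
    """Minimum spanning tree with respect to one of the edge-weight criteria."""
    parent = list(range(n_nodes))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    tree = []
    for u, v, w in sorted(edges, key=lambda e: e[2][weight_index]):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append((u, v, w))
    return tree

# edges: (u, v, (cost under criterion 1, cost under criterion 2))
edges = [(0, 1, (4, 1)), (0, 2, (1, 5)), (1, 2, (2, 2)),
         (1, 3, (3, 4)), (2, 3, (5, 1))]

for k in range(2):   # one extreme point of the Pareto front per criterion
    tree = kruskal(4, edges, k)
    totals = tuple(sum(w[i] for _, _, w in tree) for i in range(2))
    print(f"criterion {k}: objective vector {totals}")
```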

Relevance: 10.00%

Abstract:

Exact error estimates for evaluating multi-dimensional integrals are considered. An estimate is called exact if the rates of convergence of the lower- and upper-bound estimates coincide. An algorithm with such an exact rate is called optimal; it has an unimprovable rate of convergence. The existence of exact estimates and optimal algorithms is discussed for some functional spaces that define the regularity of the integrand. Data classes important for practical computations are considered: classes of functions with bounded derivatives and functions satisfying Hölder-type conditions. The aim of the paper is to analyze the performance of two classes of optimal algorithms, deterministic and randomized, for computing multidimensional integrals. It is also shown how the smoothness of the integrand can be exploited to construct better randomized algorithms.
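For orientation, the classical (Bakhvalov-type) optimal rates for functions on the unit cube with bounded derivatives up to order k are often quoted in this context; they are reproduced below from the general literature rather than from this paper.

```latex
% n = number of integrand evaluations, d = dimension, k = order of bounded derivatives,
% for approximating I(f) = \int_{[0,1]^d} f(x)\,dx over all algorithms using n function values.
\[
  \text{optimal deterministic (worst-case) error:}\qquad n^{-k/d},
\]
\[
  \text{optimal randomized (mean-square) error:}\qquad n^{-k/d - 1/2}.
\]
```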

Relevance: 10.00%

Abstract:

When a computer program requires legitimate access to confidential data, the question arises whether such a program may illegally reveal sensitive information. This paper proposes a policy model to specify what information flow is permitted in a computational system. The security definition, which is based on a general notion of information lattices, allows various representations of information to be used in the enforcement of secure information flow in deterministic or nondeterministic systems. A flexible semantics-based analysis technique is presented, which uses the input-output relational model induced by an attacker's observational power to compute the information released by the computational system. An illustrative attacker model demonstrates the use of the technique to develop a termination-sensitive analysis. The technique allows the development of various information flow analyses, parametrised by the attacker's observational power, which can be used to enforce 'what' declassification policies.
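As a very small illustration of lattice-based flow enforcement in this spirit, the sketch below permits a flow only when the source label is dominated by the sink label in a product lattice of classification levels and compartments. The lattice and labels are invented and far simpler than the information lattices used in the paper.

```python
LEVELS = {"public": 0, "secret": 1}

def dominated_by(a, b):
    """Label a = (level, compartments) may flow to b iff a is below b in the lattice."""
    return LEVELS[a[0]] <= LEVELS[b[0]] and a[1] <= b[1]

def flow_permitted(source_label, sink_label):
    return dominated_by(source_label, sink_label)

src = ("public", frozenset({"finance"}))
snk = ("secret", frozenset({"finance", "medical"}))
print(flow_permitted(src, snk))   # True: public/finance may flow up to secret/{finance, medical}
print(flow_permitted(snk, src))   # False: this flow would release secret medical data
```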

Relevance: 10.00%

Abstract:

In this paper we consider hybrid (fast stochastic approximation and deterministic refinement) algorithms for Matrix Inversion (MI) and Solving Systems of Linear Algebraic Equations (SLAE). Monte Carlo methods are used for the stochastic approximation, since it is known that they are very efficient in finding a quick rough approximation of an element or a row of the inverse matrix, or a component of the solution vector. We show how the stochastic approximation of the MI can be combined with a deterministic refinement procedure to obtain the MI with the required precision, and then solve the SLAE using the MI. We employ a splitting A = D − C of a given non-singular matrix A, where D is a diagonally dominant matrix and C is a diagonal matrix. In our algorithm for solving SLAE and MI, different choices of D can be considered in order to control the norm of the matrix T = D^{-1}C of the resulting SLAE and to minimize the number of Markov chains required to reach a given precision. We further run the algorithms on a mini-Grid and investigate their efficiency depending on the granularity. Corresponding experimental results are presented.
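The 'rough approximation plus deterministic refinement' idea can be illustrated as follows: a noisy approximate inverse stands in for the Monte Carlo estimate, and a Newton-Schulz iteration plays the role of the deterministic refinement. This is a generic sketch of the scheme, not the specific Monte Carlo estimator or refinement procedure used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = np.diag(np.full(n, 4.0)) + 0.3 * rng.standard_normal((n, n))   # well-conditioned test matrix

# Stand-in for a quick rough (stochastic) approximation of the inverse
X = np.linalg.inv(A) + 0.02 * rng.standard_normal((n, n))
I = np.eye(n)

for k in range(8):
    X = X @ (2.0 * I - A @ X)             # Newton-Schulz refinement step
    print(k, np.linalg.norm(I - A @ X))   # residual norm shrinks towards zero
```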

Relevance: 10.00%

Abstract:

In many data mining applications, automated text and image retrieval of information is needed. This becomes essential with the growth of the Internet and digital libraries. Our approach is based on latent semantic indexing (LSI) and the corresponding term-by-document matrix suggested by Berry and his co-authors. Instead of using deterministic methods to find the first "k" singular triplets, we propose a stochastic approach. First, we use a Monte Carlo method to sample and build a much smaller term-by-document matrix (e.g. a k x k matrix), from which we then find the first "k" triplets using standard deterministic methods. Second, we investigate how we can reduce the problem to finding the "k" largest eigenvalues using parallel Monte Carlo methods. We apply these methods to the initial matrix and also to the reduced one. The algorithms run on a cluster of workstations under MPI; results of experiments on textual retrieval of Web documents, together with a comparison of the proposed stochastic methods, are presented.
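For context, the deterministic core of LSI that the stochastic approach is designed to accelerate is sketched below: the first k singular triplets of a small term-by-document matrix are computed with a standard SVD and documents are compared in the reduced space. The toy matrix is invented, and the Monte Carlo sampling step proposed in the paper is not reproduced.

```python
import numpy as np

# rows = terms, columns = documents (toy term counts, invented for illustration)
td = np.array([[2, 0, 1, 0],
               [1, 0, 0, 0],
               [0, 3, 0, 1],
               [0, 1, 0, 2],
               [1, 0, 2, 0]], dtype=float)

k = 2
U, s, Vt = np.linalg.svd(td, full_matrices=False)   # singular triplets of the matrix
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T           # documents in the k-dimensional LSI space

# cosine similarity between documents 0 and 2 in the reduced space
a, b = doc_vectors[0], doc_vectors[2]
print(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```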

Relevance: 10.00%

Abstract:

The question of "what Monte Carlo models can and cannot do efficiently" is discussed for some functional spaces that define the regularity of the input data. Data classes important for practical computations are considered: classes of functions with bounded derivatives and Hölder-type conditions, as well as Korobov-like spaces. A theoretical performance analysis of some algorithms with an unimprovable rate of convergence is given. Estimates of the computational complexity of two classes of algorithms, deterministic and randomized, are presented for both problems: numerical multidimensional integration and the calculation of linear functionals of the solution of a class of integral equations.
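As a reminder of the baseline against which such analyses are made, the fragment below is a plain Monte Carlo estimate of a multidimensional integral together with its O(n^{-1/2}) statistical error; the test integrand is invented. The abstract's point is that for smoother integrands, randomized algorithms better than this baseline can be constructed.

```python
import numpy as np

def f(x):                                      # smooth test integrand on [0,1]^d
    return np.prod(np.sin(np.pi * x), axis=-1)

d, n = 4, 100_000
rng = np.random.default_rng(1)
samples = f(rng.random((n, d)))

estimate = samples.mean()                      # plain Monte Carlo estimate
std_error = samples.std(ddof=1) / np.sqrt(n)   # O(n^{-1/2}) statistical error
exact = (2.0 / np.pi) ** d                     # known value of this product integral
print(estimate, std_error, exact)
```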