20 results for HEURISTIC-SEARCH APPROACH
in CentAUR: Central Archive, University of Reading - UK
Abstract:
There are three key driving forces behind the development of Internet Content Management Systems (CMS): a desire to manage the explosion of content, a desire to provide structure and meaning to content in order to make it accessible, and a desire to work collaboratively to manipulate content in some meaningful way. Yet the traditional CMS has been unable to meet the last of these requirements, often failing to provide sufficient tools for collaboration in a distributed context. Peer-to-Peer (P2P) systems are networks in which every node is an equal participant (whether transmitting data, exchanging content, or invoking services) and there is an absence of any centralised administrative or coordinating authority. P2P systems are inherently more scalable than equivalent client-server implementations, as they tend to use resources at the edge of the network much more effectively. This paper details the rationale and design of a P2P middleware for collaborative content management.
Abstract:
Competency management is a very important part of a well-functioning organisation. Unfortunately, competency descriptions are neither uniformly specified nor defined across national, sectoral or organisational borders, leading to an opaque competency description market with a multitude of competency frameworks and competency benchmarks. An ontology is a formalised description of a domain, which enables automated reasoning engines to be built that, by utilising the interrelations between entities, can make “intelligent” choices in different situations within the domain. By introducing formalised competency ontologies, automated tools such as skill gap analysis, training suggestion generation, and job search and recruitment can be developed which compare and contrast different competency descriptions on the semantic level. The major problem with defining a common formalised ontology for competencies is that there are so many viewpoints of competencies and competency frameworks. Work within the TRACE project has focused on finding common trends within different competency frameworks in order to allow an intermediate competency description to be made which other frameworks can reference. This research has shown that competencies can be divided into “knowledge”, “skills” and what we call “others”. An ontology has been created on this basis, with a simple structure of different “kinds” of “knowledges” and “skills” using semantic interrelations to define the basic semantic structure of the ontology. A prototype tool for performing skill gap analysis has been developed. Personal profiles can be produced using the tool, and a skill gap analysis is performed against a desired competency profile by an ontologically based inference engine, which is able to list the closest fit and possible proficiency gaps.
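A minimal sketch of the kind of ontology-backed skill gap analysis described above, assuming a toy competency ontology; the competencies, the single "kind-of" relation and the matching rule are illustrative stand-ins, not the TRACE ontology or the prototype tool.

```python
# Toy competency ontology: each competency points to its broader parent,
# so the gap analysis can credit a person who holds a more specific
# competency than the one a profile asks for. All names are invented.
ONTOLOGY = {
    "python":      "programming",
    "java":        "programming",
    "programming": "ict-skills",
    "sql":         "data-skills",
}

def generalisations(competency):
    """Yield a competency together with all its ancestors in the ontology."""
    while competency is not None:
        yield competency
        competency = ONTOLOGY.get(competency)

def skill_gap(personal_profile, desired_profile):
    """List desired competencies covered neither directly nor via a more
    specific competency in the personal profile."""
    covered = {g for c in personal_profile for g in generalisations(c)}
    return [c for c in desired_profile if c not in covered]

person = {"python", "sql"}
job = ["programming", "java", "data-skills"]
print(skill_gap(person, job))   # ['java']: 'programming' and 'data-skills'
                                # are inferred from 'python' and 'sql'
```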
Abstract:
Holocene tidal palaeochannels, Severn Estuary Levels, UK: a search for granulometric and foraminiferal criteria. Proceedings of the Geologists' Association, 117, 329-344. Grain-size characteristics (by laser granulometry) and foraminiferal assemblages have been established for silts accumulated in five dissimilar tidal palaeochannels of mid or late Holocene age in the Severn Estuary Levels, representative of muddy tidal systems. For purposes of general comparison, similar data were obtained from a representative active tidal inlet in the area, but all of these channels have been subject to human interference and are not relied upon as a model for environmental interpretation. Although the palaeochannel deposits differ substantially in their bedding characteristics and stratigraphical relationships from the level-bedded salt-marsh platform and mudflat deposits with which they are associated, and although the channel environment is distinctive morphologically and hydraulically, no critical textural differences could be found between the channel deposits and the associated facies. Similarly, no foraminiferal assemblages distinctive of a tidal channel were encountered. Instead, the assemblages compare with those from mudflats and salt-marsh platforms. It is concluded that the sides of the subfossil channels carried some vegetation, as was observed to be the case in the modern inlet. An alternative approach is necessary if concealed palaeochannel deposits are to be recognized in muddy systems from limited numbers of subsurface samples. Although the palaeochannels afforded no characteristic textural signature, they yield transverse grain-size patterns pointing to coastal movements during their evolution. Concave-up trends suggest outward coastal building, whereas convex-up ones point to marsh-edge retreat.
Abstract:
Empirical orthogonal functions (EOFs) are widely used in climate research to identify dominant patterns of variability and to reduce the dimensionality of climate data. EOFs, however, can be difficult to interpret. Rotated empirical orthogonal functions (REOFs) have been proposed as more physical entities with simpler patterns than EOFs. This study presents a new approach for finding climate patterns with simple structures that overcomes the problems encountered with rotation. The method achieves simplicity of the patterns by using the main properties of EOFs and REOFs simultaneously. Orthogonal patterns that maximise variance subject to a constraint that induces a form of simplicity are found. The simplified empirical orthogonal function (SEOF) patterns, being more 'local', are constrained to have zero loadings outside the main centre of action. The method is applied to winter Northern Hemisphere (NH) monthly mean sea level pressure (SLP) reanalyses over the period 1948-2000. The 'simplified' leading patterns of variability are identified and compared to the leading patterns obtained from EOFs and REOFs. Copyright (C) 2005 Royal Meteorological Society.
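The paper's constrained variance maximisation is more involved, but the 'local pattern' idea can be conveyed by computing ordinary EOFs and zeroing loadings outside the main centre of action. The hard threshold below is a crude stand-in for the simplicity constraint, and the data are synthetic, so this is a sketch of the notion rather than the SEOF method itself.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 50))   # toy anomaly field: months x grid points
X -= X.mean(axis=0)                  # centre to anomalies

# Ordinary EOFs from the SVD of the anomaly matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
eof1 = Vt[0]                         # leading loading pattern

# "Simplify": keep only loadings near the centre of action, zero the rest.
threshold = 0.5 * np.abs(eof1).max()
seof1 = np.where(np.abs(eof1) >= threshold, eof1, 0.0)
seof1 /= np.linalg.norm(seof1)       # re-normalise the sparsified pattern

print("non-zero loadings:", np.count_nonzero(seof1), "of", eof1.size)
```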
Abstract:
Two models for predicting Septoria tritici on winter wheat (cv. Riband) were developed using a program based on an iterative search of correlations between disease severity and weather. Data from four consecutive cropping seasons (1993/94 until 1996/97) at nine sites throughout England were used. A qualitative model predicted the presence or absence of Septoria tritici (at a 5% severity threshold within the top three leaf layers) using winter temperature (January/February) and wind speed up to about the 'first node detectable' growth stage. For sites above the disease threshold, a quantitative model predicted severity of Septoria tritici using rainfall during stem elongation. A test statistic was derived to test the validity of the iterative search used to obtain both models. This statistic was used in combination with bootstrap analyses, in which the search program was rerun using weather data from previous years (therefore uncorrelated with the disease data), to investigate how likely correlations such as the ones found in our models would have been in the absence of genuine relationships.
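A toy sketch of the two-stage scheme, where all thresholds and coefficients are invented placeholders (the abstract does not report the fitted values):

```python
def predict_septoria(mean_winter_temp_c, wind_speed_ms, stem_elong_rain_mm,
                     temp_threshold=4.0, wind_threshold=4.5, rain_coeff=0.05):
    # Stage 1 (qualitative): presence/absence from January/February
    # temperature and wind speed up to 'first node detectable'.
    present = mean_winter_temp_c > temp_threshold and wind_speed_ms > wind_threshold
    if not present:
        return 0.0
    # Stage 2 (quantitative): severity from rainfall during stem elongation.
    return rain_coeff * stem_elong_rain_mm

print(predict_septoria(5.1, 5.0, 120.0))   # 6.0 (toy severity units)
```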
Abstract:
Estimation of population size with missing zero-class is an important problem that is encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by the method of maximum likelihood and estimation of the population size based on this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) has proposed a robust estimator for unclustered data that works well in a wide class of distributions applicable for count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and thereby using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size was found to be underestimated. In search of a more robust estimator, we focused on three models that use all clusters with exactly one case, those clusters with exactly two cases and those with exactly three cases to estimate the probability of the zero-class and thereby use data collected on all the clusters in the Horvitz-Thompson estimator of population size. Loss in efficiency associated with gain in robustness was examined based on a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero-class was found to be preferred in general. In applications, we recommend obtaining estimates from all three models and making a choice in light of the estimates from the three models, robustness and the loss in efficiency. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
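For reference, Zelterman's (1988) estimator for the unclustered case, on which the extension above builds, combines a robust local estimate of the Poisson rate with a Horvitz-Thompson-type correction for the missing zero-class; a minimal sketch (the clustered version requires fitting the zero-truncated Poisson model by maximum likelihood and is not reproduced here):

```python
import math

def zelterman_population_size(n, n1, n2):
    """n: observed (non-zero) units; n1, n2: units seen exactly once/twice."""
    lam = 2.0 * n2 / n1            # robust local estimate of the Poisson rate
    p0 = math.exp(-lam)            # estimated probability of the zero-class
    return n / (1.0 - p0)          # Horvitz-Thompson-type population size

print(round(zelterman_population_size(n=500, n1=300, n2=120), 1))   # ~908.0
```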
Abstract:
Summary: The program LVB seeks parsimonious phylogenies from nucleotide alignments, using the simulated annealing heuristic. LVB runs fast and gives high-quality results.
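LVB's tree rearrangement code is not reproduced here, but the simulated annealing heuristic it relies on follows the standard Metropolis acceptance scheme; a generic sketch, minimising a toy function rather than tree length:

```python
import math
import random

def simulated_annealing(cost, neighbour, state, t0=1.0, cooling=0.995,
                        steps=10_000):
    """Generic annealing loop: always accept improvements, accept worse
    moves with probability exp(-delta / temperature), cool geometrically."""
    best = current = state
    t = t0
    for _ in range(steps):
        candidate = neighbour(current)
        delta = cost(candidate) - cost(current)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current = candidate
        if cost(current) < cost(best):
            best = current
        t *= cooling
    return best

best = simulated_annealing(cost=lambda x: (x - 3.0) ** 2,
                           neighbour=lambda x: x + random.uniform(-0.5, 0.5),
                           state=0.0)
print(round(best, 2))   # close to 3.0
```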
Abstract:
Purpose - The purpose of this paper is to identify the most popular techniques used to rank a web page highly in Google. Design/methodology/approach - The paper presents the results of a study into 50 highly optimized web pages that were created as part of a Search Engine Optimization competition. The study focuses on the most popular techniques that were used to rank highest in this competition, and includes an analysis of the use of PageRank, number of pages, number of in-links, domain age and the use of third-party sites such as directories and social bookmarking sites. A separate study was made of 50 non-optimized web pages for comparison. Findings - The paper provides insight into the techniques that successful Search Engine Optimizers use to ensure a page ranks highly in Google, and recognizes the importance of PageRank and links as well as of directories and social bookmarking sites. Research limitations/implications - Only the top 50 web sites for a specific query were analysed. Analysing more web sites and comparing with similar studies of different competitions would provide more concrete results. Practical implications - The paper offers a revealing insight into the techniques used by industry experts to rank highly in Google, and the success or otherwise of those techniques. Originality/value - This paper fulfils an identified need for web sites and e-commerce sites keen to attract a wider web audience.
Abstract:
We introduce and describe the Multiple Gravity Assist problem, a global optimisation problem that is of great interest in the design of spacecraft and their trajectories. We discuss its formalisation and we show, in one particular problem instance, the performance of selected state-of-the-art heuristic global optimisation algorithms. A deterministic search space pruning algorithm is then developed and its polynomial time and space complexity derived. The algorithm is shown to achieve search space reductions of greater than six orders of magnitude, thus significantly reducing the complexity of the subsequent optimisation.
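The bounding functions behind the pruning are problem-specific, but the principle, discarding any region whose objective lower bound already exceeds the best known upper bound, is generic; a sketch on a one-dimensional search space, with an assumed Lipschitz bound standing in for the paper's trajectory-specific bounds:

```python
import math

def objective(x):
    return (x - 2.0) ** 2 + math.sin(5.0 * x)

def lower_bound(a, b, lipschitz=25.0):
    # Valid lower bound on [a, b] for a Lipschitz-continuous objective.
    return objective(0.5 * (a + b)) - lipschitz * 0.5 * (b - a)

# Partition [0, 5] into 50 cells, then prune deterministically.
cells = [(i / 10.0, i / 10.0 + 0.1) for i in range(50)]
best_upper = min(objective(0.5 * (a + b)) for a, b in cells)
kept = [(a, b) for a, b in cells if lower_bound(a, b) <= best_upper]
print(f"kept {len(kept)} of {len(cells)} cells")
```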
Abstract:
We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners which operate on discriminative image-intensity features that are defined on small patches and are fast to compute. A database designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line-search texture boundary detection in the direction normal to an initial boundary estimate. This method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, requiring only a ribbon-like search region, and can handle complex texture structures without requiring a large number of observations. We demonstrate results in the contexts of interactive 2D delineation and fast 3D tracking, and compare the method's performance with other existing methods for line-search boundary detection.
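A sketch of the line-search step, with a stand-in scoring function in place of the trained classifier; the point being illustrated is the geometry, sampling along the normal to an initial boundary estimate and keeping the highest-scoring offset:

```python
import numpy as np

def refine_boundary_point(point, normal, score, max_offset=5.0, step=0.5):
    """Search along +/- normal for the most boundary-like location."""
    normal = np.asarray(normal, dtype=float)
    normal /= np.linalg.norm(normal)
    offsets = np.arange(-max_offset, max_offset + step, step)
    candidates = [np.asarray(point, dtype=float) + t * normal for t in offsets]
    return max(candidates, key=score)

# Toy score: pretend the true boundary is the vertical line x = 10.
score = lambda p: -abs(p[0] - 10.0)
print(refine_boundary_point(point=(8.0, 4.0), normal=(1.0, 0.0), score=score))
# -> [10.  4.]
```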
Abstract:
1. Reductions in resource availability, associated with land-use change and agricultural intensification in the UK and Europe, have been linked with the widespread decline of many farmland bird species over recent decades. However, the underlying ecological processes which link resource availability and population trends are poorly understood.
2. We construct a spatial depletion model to investigate the relationship between the population persistence of granivorous birds within the agricultural landscape and the temporal dynamics of stubble field availability, an important source of winter food for many of those species.
3. The model is capable of accurately predicting the distribution of a given number of finches and buntings amongst patches of different stubble types in an agricultural landscape over the course of a winter, and of assessing the relative value of different landscapes in terms of resource availability.
4. Sensitivity analyses showed that the model is relatively robust to estimates of energetic requirements, search efficiency and handling time, but that daily seed survival estimates have a strong influence on model fit. Understanding resource dynamics in agricultural landscapes is highlighted as a key area for further research.
5. There was a positive relationship between the predicted number of bird days supported by a landscape over winter and the breeding population trend for yellowhammer Emberiza citrinella, a species for which survival has been identified as the primary driver of population dynamics, but not for linnet Carduelis cannabina, a species for which productivity has been identified as the primary driver.
6. Synthesis and applications. We believe this model can be used to guide the effective delivery of over-winter food resources under agri-environment schemes and to assess the impacts on granivorous birds of changing resource availability associated with novel changes in land use. This could be very important in the future as farming adapts to an increasingly dynamic trading environment, in which demands for increased agricultural production must be reconciled with objectives for environmental protection, including biodiversity conservation.
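A heavily simplified depletion sketch in the spirit of the model, with invented parameter values; the actual model's patch choice, energetics and seed dynamics are richer than the 'feed in the richest patch' rule used here:

```python
# Track cumulative bird-days a toy two-patch stubble landscape supports.
seeds = {"wheat_stubble": 8.0e5, "barley_stubble": 5.0e5}   # seeds per patch
n_birds, need_per_bird = 200, 300.0     # seeds required per bird per day
daily_seed_survival = 0.98              # survival from non-bird seed losses

bird_days = 0
for day in range(150):                  # one winter
    patch = max(seeds, key=seeds.get)   # flock feeds in the richest patch
    demand = n_birds * need_per_bird
    if seeds[patch] < demand:
        break                           # landscape no longer supports the flock
    seeds[patch] -= demand              # depletion by foraging
    bird_days += n_birds
    for p in seeds:                     # background depletion of all patches
        seeds[p] *= daily_seed_survival

print("bird-days supported:", bird_days)
```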
Abstract:
While search is normally modelled by economists purely in terms of decisions over making observations, this paper models it as a process in which information is gained through feedback from innovatory product launches. The information gained can then be used to decide whether to exercise real options. In the model the initial decisions involve a product design and the scale of production capacity. There are then real options to change these factors based on what is learned. The case of launching product variants in parallel is also considered. Under ‘true’ uncertainty, the model can be seen in terms of heuristic decision-making based on subjective beliefs with limited foresight. Search costs, the values of the real options, beliefs, and the cost of capital are all shown to be significant in determining the search path.
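A toy numerical reading of the launch-and-learn mechanism: demand observed after a launch is the information gained, and the real option to switch designs is exercised only when the believed gain covers the switching cost. All quantities are illustrative and not taken from the paper's model.

```python
import random

switch_cost = 20.0        # cost of exercising the option to redesign
prior_b = 60.0            # subjective belief about design B's demand
true_demand_a = 40.0      # design A's demand, unknown before launch

# Launch design A; the noisy demand signal is the feedback from the launch.
signal_a = true_demand_a + random.gauss(0.0, 5.0)

# Exercise the real option only if the believed net gain is positive.
if prior_b - switch_cost > signal_a:
    decision = "exercise option: switch to design B"
else:
    decision = "keep design A"
print(f"observed demand {signal_a:.1f} -> {decision}")
```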
Abstract:
The Stochastic Diffusion Search (SDS) was developed as a solution to the best-fit search problem. Thus, as a special case, it is capable of solving the transform-invariant pattern recognition problem. SDS is efficient and, although inherently probabilistic, produces very reliable solutions in widely ranging search conditions. However, to date a systematic formal investigation of its properties has not been carried out. This thesis addresses this problem. The thesis reports results pertaining to the global convergence of SDS and characterises its time complexity. The main emphasis of the work, however, is on the resource allocation aspect of Stochastic Diffusion Search operation. The thesis introduces a novel model of the algorithm, generalising an Ehrenfest urn model from statistical physics. This approach makes it possible to obtain a thorough characterisation of the response of the algorithm in terms of the parameters describing the search conditions in the case of a unique best-fit pattern in the search space. This model is further generalised in order to account for different search conditions: two solutions in the search space, and search for a unique solution in a noisy search space. An approximate solution in the case of two alternative solutions is also proposed and compared with predictions of the extended Ehrenfest urn model. The analysis performed enabled a quantitative characterisation of the Stochastic Diffusion Search in terms of exploration and exploitation of the search space; it appeared that SDS is biased towards the latter mode of operation. This novel perspective on the Stochastic Diffusion Search led to an investigation of extensions of the standard SDS that would strike a different balance between these two modes of search space processing. Thus, two novel algorithms were derived from the standard Stochastic Diffusion Search, ‘context-free’ and ‘context-sensitive’ SDS, and their properties were analysed with respect to resource allocation. It appeared that they shared some of the desired features of their predecessor but also possessed properties not present in the classic SDS. The theory developed in the thesis is illustrated throughout with carefully chosen simulations of a best-fit search for a string pattern, a simple but representative domain enabling careful control of search conditions.
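Since the thesis illustrates its theory with best-fit search for a string pattern, a textbook implementation of standard SDS on that domain makes the test and diffusion phases concrete (this is a generic implementation, not the thesis's code):

```python
import random
from collections import Counter

random.seed(0)
text, pattern = "xxhelpoxxxhelloxx", "hello"
n_agents, iterations = 50, 60
positions = range(len(text) - len(pattern) + 1)

agents = [{"hyp": random.choice(positions), "active": False}
          for _ in range(n_agents)]

for _ in range(iterations):
    for a in agents:                            # test phase: each agent checks
        i = random.randrange(len(pattern))      # one random pattern component
        a["active"] = text[a["hyp"] + i] == pattern[i]
    for a in agents:                            # diffusion phase
        if not a["active"]:
            other = random.choice(agents)
            if other["active"]:
                a["hyp"] = other["hyp"]         # recruit to active hypothesis
            else:
                a["hyp"] = random.choice(positions)   # re-seed at random

counts = Counter(a["hyp"] for a in agents)
print(counts.most_common(1)[0][0])   # expected: 10, where 'hello' begins
```

The diffusion rule is what biases the population towards exploitation: active agents retain and spread their hypotheses, so agents cluster rapidly on the best-fit position.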
Abstract:
The Court of Justice has, over the years, often been vilified for exceeding the limits of its jurisdiction by interpreting the provisions of Community legislation in a way that does not seem to have been originally envisaged by its drafters. A recent example of this approach is a cluster of cases in the context of the free movement of workers and the freedom of establishment (Ritter-Coulais and its progeny), where the Court included within the scope of those provisions situations which, arguably, did not present a sufficient link with their (economic) aim. In particular, in that case law the Court accepted that the mere exercise of free movement for the purpose of taking up residence in the territory of another Member State, whilst continuing to exercise an economic activity in the State of origin, suffices to bring a Member State national within the scope of Articles 39 and 43 EC. It is argued that the most plausible explanation for this approach is that the Court now wishes to re-read the economic fundamental freedoms in such a way as to include within their scope all economically active Union citizens, irrespective of whether their situation presents a sufficient link with the exercise of an economic activity in a cross-border context. It is suggested that this approach is problematic for a number of reasons. It is, therefore, concluded that the Court should revert to its orthodox approach, according to which only situations involving Union citizens who have moved between Member States for the purpose of taking up an economic activity should be included within the scope of the market freedoms.
Abstract:
In this paper a support vector machine (SVM) approach for characterizing the feasible parameter set (FPS) in non-linear set-membership estimation problems is presented. It iteratively solves a regression problem from which an approximation of the boundary of the FPS can be determined. To guarantee convergence to the boundary, the procedure includes a no-derivative line search; for an appropriate coverage of points on the FPS boundary, it is suggested to start with a sequential box pavement procedure. The SVM approach is illustrated on a simple sine and exponential model with two parameters and an agro-forestry simulation model.
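A rough sketch of the core idea, using an SVM classifier on feasible/infeasible parameter samples so that its decision boundary approximates the FPS boundary; the paper's iterative regression formulation, line search and box pavement are not reproduced, and the feasibility criterion below is a made-up toy:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
theta = rng.uniform(-2.0, 2.0, size=(500, 2))    # sampled parameter vectors

# Toy set-membership criterion: parameters are feasible when the output of
# a simple sine model stays within the assumed error bound.
feasible = np.abs(np.sin(theta[:, 0]) + theta[:, 1] - 0.5) <= 0.4

svm = SVC(kernel="rbf", C=10.0).fit(theta, feasible)

# Points where the decision function is near zero lie close to the
# approximated FPS boundary.
grid = rng.uniform(-2.0, 2.0, size=(2000, 2))
near_boundary = grid[np.abs(svm.decision_function(grid)) < 0.05]
print(f"{len(near_boundary)} grid points near the approximated boundary")
```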