12 results for MODEL SEARCH

in CentAUR: Central Archive University of Reading - UK


Relevance: 30.00%

Abstract:

Holocene tidal palaeochannels, Severn Estuary Levels, UK: a search for granulometric and foraminiferal criteria. Proceedings of the Geologists' Association, 117, 329-344. Grain-size characteristics (by laser granulometry) and foraminiferal assemblages have been established for silts accumulated in five dissimilar tidal palaeochannels of mid or late Holocene age in the Severn Estuary Levels, representative of muddy tidal systems. For purposes of general comparison, similar data were obtained from a representative active tidal inlet in the area, but all of these channels have been subject to human interference and are not relied upon as a model for environmental interpretation. Although the palaeochannel deposits differ substantially in their bedding characteristics and stratigraphical relationships from the level-bedded salt-marsh platform and mudflat deposits with which they are associated, and although the channel environment is distinctive morphologically and hydraulically, no critical textural differences could be found between the channel deposits and the associated facies. Similarly, no foraminiferal assemblages distinctive of a tidal channel were encountered. Instead, the assemblages compare with those from mudflats and salt-marsh platforms. It is concluded that the sides of the subfossil channels carried some vegetation, as was observed to be the case in the modern inlet. An alternative approach is necessary if concealed palaeochannel deposits are to be recognized in muddy systems from limited numbers of subsurface samples. Although the palaeochannels afforded no characteristic textural signature, they yield transverse grain-size patterns pointing to coastal movements during their evolution. Concave-up trends suggest outward coastal building, whereas convex-up ones point to marsh-edge retreat.
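The curvature criterion at the end of the abstract lends itself to a simple check. A minimal sketch, assuming a hypothetical cross-channel transect of mean grain sizes, fits a quadratic and classifies the trend by the sign of its leading coefficient:

```python
import numpy as np

def classify_transverse_trend(distance_m, mean_grain_size_um):
    """Fit a quadratic to a cross-channel grain-size transect and classify its
    curvature (hypothetical diagnostic, following the concave-up / convex-up
    distinction described above)."""
    a, b, c = np.polyfit(distance_m, mean_grain_size_um, deg=2)
    if a > 0:
        return "concave-up (consistent with outward coastal building)"
    if a < 0:
        return "convex-up (consistent with marsh-edge retreat)"
    return "no curvature detected"

# Hypothetical transect: position across the channel (m) and mean grain size (microns)
distance = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
grain_size = np.array([22.0, 18.0, 16.0, 15.5, 16.5, 19.0, 23.0])
print(classify_transverse_trend(distance, grain_size))
```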

Relevance: 30.00%

Abstract:

Genetic polymorphisms in deoxyribonucleic acid coding regions may have a phenotypic effect on the carrier, e.g. by influencing susceptibility to disease. Detection of deleterious mutations via association studies is hampered by the large number of candidate sites; therefore methods are needed to narrow down the search to the most promising sites. For this, a possible approach is to use structural and sequence-based information of the encoded protein to predict whether a mutation at a particular site is likely to disrupt the functionality of the protein itself. We propose a hierarchical Bayesian multivariate adaptive regression spline (BMARS) model for supervised learning in this context and assess its predictive performance by using data from mutagenesis experiments on lac repressor and lysozyme proteins. In these experiments, about 12 amino-acid substitutions were performed at each native amino-acid position and the effect on protein functionality was assessed. The training data thus consist of repeated observations at each position, which the hierarchical framework is needed to account for. The model is trained on the lac repressor data and tested on the lysozyme mutations and vice versa. In particular, we show that the hierarchical BMARS model, by allowing for the clustered nature of the data, yields lower out-of-sample misclassification rates compared with both a BMARS and a frequentist MARS model, a support vector machine classifier and an optimally pruned classification tree.
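The cross-protein evaluation protocol (train on one mutagenesis data set, test on the other, compare out-of-sample misclassification) can be sketched with the frequentist baselines mentioned above; the feature matrices and labels here are hypothetical stand-ins, and the hierarchical BMARS model itself is not reproduced:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical stand-ins for per-mutation structural/sequence features and
# binary "disrupts function" labels of the two mutagenesis data sets.
X_lac, y_lac = rng.normal(size=(400, 6)), rng.integers(0, 2, size=400)
X_lys, y_lys = rng.normal(size=(160, 6)), rng.integers(0, 2, size=160)

def cross_protein_error(clf, X_train, y_train, X_test, y_test):
    """Train on one protein's mutations and report the out-of-sample
    misclassification rate on the other protein (swap arguments for vice versa)."""
    clf.fit(X_train, y_train)
    return np.mean(clf.predict(X_test) != y_test)

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("pruned tree", DecisionTreeClassifier(max_depth=4))]:
    err = cross_protein_error(clf, X_lac, y_lac, X_lys, y_lys)
    print(f"{name}: lac repressor -> lysozyme misclassification rate {err:.3f}")
```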

Relevance: 30.00%

Abstract:

We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners, which operate on discriminative image-intensity features that are defined on small patches and are fast to compute. A database that is designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line search texture boundary detection in the direction normal to an initial boundary estimate. This method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, which requires a ribbon-like search region and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking and compare its performance with other existing methods for line search boundary detection.
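A minimal sketch of the line-search step described above, assuming a hypothetical trained classifier `transition_prob` that scores a small patch for the presence of a texture transition: positions are sampled along the normal within a ribbon around an initial boundary estimate and the highest-scoring one is kept.

```python
import numpy as np

def refine_boundary_point(image, point, normal, half_width, transition_prob, patch_size=7):
    """Search along the normal direction within a ribbon of +/- half_width pixels
    and return the position with the highest texture-transition score.
    `transition_prob(patch)` stands in for the trained boosted classifier."""
    best_pos, best_score = point, -np.inf
    h = patch_size // 2
    for t in range(-half_width, half_width + 1):
        r, c = np.round(point + t * normal).astype(int)
        patch = image[r - h:r + h + 1, c - h:c + h + 1]
        if patch.shape != (patch_size, patch_size):
            continue  # skip positions too close to the image border
        score = transition_prob(patch)
        if score > best_score:
            best_pos, best_score = np.array([r, c], dtype=float), score
    return best_pos, best_score
```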

Relevance: 30.00%

Abstract:

While search is normally modelled by economists purely in terms of decisions over making observations, this paper models it as a process in which information is gained through feedback from innovatory product launches. The information gained can then be used to decide whether to exercise real options. In the model the initial decisions involve a product design and the scale of production capacity. There are then real options to change these factors based on what is learned. The case of launching product variants in parallel is also considered. Under ‘true’ uncertainty, the model can be seen in terms of heuristic decision-making based on subjective beliefs with limited foresight. Search costs, the values of the real options, beliefs, and the cost of capital are all shown to be significant in determining the search path.
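A minimal numerical sketch of the mechanism, with all parameter values hypothetical: the firm pays a search cost to launch a variant, learns from a noisy demand signal via a Bayesian update, and exercises a real option to scale up only if the discounted expected payoff (with the cost of capital as the discount rate) covers the investment.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters
prior_mean, prior_var = 1.0, 0.5**2   # belief about the demand level
signal_var = 0.3**2                   # noise in the feedback from a launch
search_cost = 0.2                     # cost of one innovatory launch
scale_up_cost = 3.0                   # investment to exercise the expansion option
margin, cost_of_capital = 4.0, 0.10   # payoff per unit demand, discount rate

true_demand = 1.3
signal = true_demand + rng.normal(scale=np.sqrt(signal_var))

# Bayesian update of the demand belief from the launch feedback
post_var = 1.0 / (1.0 / prior_var + 1.0 / signal_var)
post_mean = post_var * (prior_mean / prior_var + signal / signal_var)

# Exercise the real option only if the discounted expected payoff covers the cost
npv_expand = margin * post_mean / (1.0 + cost_of_capital) - scale_up_cost
print(f"search cost paid: {search_cost}, posterior demand belief: {post_mean:.2f}")
print("exercise expansion option" if npv_expand > 0 else "keep the option open")
```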

Relevance: 30.00%

Abstract:

We consider a non-local version of the NJL model, based on a separable quark-quark interaction. The interaction is extended to include terms that bind vector and axial-vector mesons. The non-locality means that no further regulator is required. Moreover the model is able to confine the quarks by generating a quark propagator without poles at real energies. Working in the ladder approximation, we calculate amplitudes in Euclidean space and discuss features of their continuation to Minkowski energies. Conserved currents are constructed and we demonstrate their consistency with various Ward identities. Various meson masses are calculated, along with their strong and electromagnetic decay amplitudes. We also calculate the electromagnetic form factor of the pion, as well as form factors associated with the processes γγ* → π0 and ω → π0γ*. The results are found to lead to a satisfactory phenomenology and lend some dynamical support to the idea of vector-meson dominance.
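For orientation only (normalisations and constants depend on the conventions of the particular separable treatment, so this is a schematic rather than the authors' exact equations), a rank-one separable interaction with form factor f(p) leads in Euclidean space to a momentum-dependent dynamical quark mass of the form

```latex
% Schematic structure of a separable non-local gap equation (illustrative only).
\begin{align}
  S(p)^{-1} &= \gamma\cdot p - m(p^2), \qquad m(p^2) = m_c + f^2(p^2)\,\Sigma_0, \\
  \Sigma_0 &\propto G \int \frac{\mathrm{d}^4 q_E}{(2\pi)^4}\,
            f^2(q_E^2)\,\frac{m(q_E^2)}{q_E^2 + m^2(q_E^2)}.
\end{align}
```

Confinement in the sense used above then corresponds to $q^2 + m^2(q^2)$ having no zeros at real energies, so that the dressed quark propagator has no pole.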

Relevance: 30.00%

Abstract:

The Stochastic Diffusion Search (SDS) was developed as a solution to the best-fit search problem. Thus, as a special case it is capable of solving the transform invariant pattern recognition problem. SDS is efficient and, although inherently probabilistic, produces very reliable solutions in widely ranging search conditions. However, to date a systematic formal investigation of its properties has not been carried out. This thesis addresses this problem. The thesis reports results pertaining to the global convergence of SDS as well as a characterisation of its time complexity. However, the main emphasis of the work is on the resource allocation aspect of Stochastic Diffusion Search operations. The thesis introduces a novel model of the algorithm, generalising an Ehrenfest Urn Model from statistical physics. This approach makes it possible to obtain a thorough characterisation of the response of the algorithm in terms of the parameters describing the search conditions in the case of a unique best-fit pattern in the search space. This model is further generalised in order to account for different search conditions: two solutions in the search space and search for a unique solution in a noisy search space. An approximate solution for the case of two alternative solutions is also proposed and compared with predictions of the extended Ehrenfest Urn model. The analysis performed enabled a quantitative characterisation of the Stochastic Diffusion Search in terms of exploration and exploitation of the search space. It appeared that SDS is biased towards the latter mode of operation. This novel perspective on the Stochastic Diffusion Search led to an investigation of extensions of the standard SDS, which would strike a different balance between these two modes of search space processing. Thus, two novel algorithms were derived from the standard Stochastic Diffusion Search, ‘context-free’ and ‘context-sensitive’ SDS, and their properties were analysed with respect to resource allocation. It appeared that they shared some of the desired features of their predecessor but also possessed some properties not present in the classic SDS. The theory developed in the thesis was illustrated throughout with carefully chosen simulations of a best-fit search for a string pattern, a simple but representative domain, enabling careful control of search conditions.
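A minimal sketch of the standard SDS formulation on the best-fit string search used for the illustrations (the thesis' context-free and context-sensitive variants are not reproduced): each agent holds an offset hypothesis, tests one randomly chosen pattern component in the test phase, and in the diffusion phase inactive agents either copy a randomly polled active agent's hypothesis or re-initialise at random.

```python
import random

def sds_string_search(search_space, pattern, n_agents=100, iterations=200, seed=0):
    """Standard Stochastic Diffusion Search for the offset of the best-fitting
    copy of `pattern` inside `search_space` (illustrative sketch)."""
    rng = random.Random(seed)
    max_offset = len(search_space) - len(pattern)
    hypotheses = [rng.randint(0, max_offset) for _ in range(n_agents)]
    active = [False] * n_agents

    for _ in range(iterations):
        # Test phase: each agent checks one randomly chosen pattern component.
        for i, h in enumerate(hypotheses):
            j = rng.randrange(len(pattern))
            active[i] = (search_space[h + j] == pattern[j])
        # Diffusion phase: inactive agents poll a random agent.
        for i in range(n_agents):
            if not active[i]:
                k = rng.randrange(n_agents)
                hypotheses[i] = hypotheses[k] if active[k] else rng.randint(0, max_offset)

    # The largest cluster of agents indicates the best-fit offset.
    return max(set(hypotheses), key=hypotheses.count)

print(sds_string_search("xxxhelloxxhelpxx", "hello"))  # expected offset: 3
```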

Relevance: 30.00%

Abstract:

In a recent study, Williams introduced a simple modification to the widely used Robert–Asselin (RA) filter for numerical integration. The main purpose of the Robert–Asselin–Williams (RAW) filter is to avoid the undesired numerical damping of the RA filter and to increase the accuracy. In the present paper, the effects of the modification are comprehensively evaluated in the Simplified Parameterizations, Primitive Equation Dynamics (SPEEDY) atmospheric general circulation model. First, the authors search for significant changes in the monthly climatology due to the introduction of the new filter. After testing both at the local level and at the field level, no significant changes are found, which is advantageous in the sense that the new scheme does not require a retuning of the parameterized model physics. Second, the authors examine whether the new filter improves the skill of short- and medium-term forecasts. January 1982 data from the NCEP–NCAR reanalysis are used to evaluate the forecast skill. Improvements are found in all the model variables (except the relative humidity, which is hardly changed). The improvements increase with lead time and are especially evident in medium-range forecasts (96–144 h). For example, in tropical surface pressure predictions, 5-day forecasts made using the RAW filter have approximately the same skill as 4-day forecasts made using the RA filter. The results of this work are encouraging for the implementation of the RAW filter in other models currently using the RA filter.
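The modification under test is small enough to state in code. In a leapfrog scheme the Robert–Asselin filter adds the whole filter displacement to the middle time level, whereas the RAW filter shares it between the middle and newest levels with a weight alpha (alpha = 1 recovers RA; values near 0.53 are typically recommended). A minimal sketch on a scalar oscillation equation, with hypothetical parameter values:

```python
import numpy as np

def leapfrog_raw(f, x0, dt, n_steps, nu=0.2, alpha=0.53):
    """Leapfrog integration of dx/dt = f(x) with the RAW filter.
    alpha = 1.0 reduces to the classical Robert-Asselin filter."""
    x_prev = x0
    x_curr = x0 + dt * f(x0)          # simple first step (illustrative)
    for _ in range(n_steps):
        x_next = x_prev + 2.0 * dt * f(x_curr)
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)   # filter displacement
        x_curr += alpha * d                               # RA part
        x_next += (alpha - 1.0) * d                       # Williams' extra term
        x_prev, x_curr = x_curr, x_next
    return x_curr

# Undamped oscillation dx/dt = i*omega*x: the RAW filter (alpha < 1) damps the
# physical mode less than the classical RA filter (alpha = 1).
omega = 1.0
f = lambda x: 1j * omega * x
for a in (1.0, 0.53):
    amp = abs(leapfrog_raw(f, 1.0 + 0j, dt=0.1, n_steps=500, alpha=a))
    print(f"alpha = {a:.2f}: amplitude after 500 steps = {amp:.4f}")
```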

Relevance: 30.00%

Abstract:

This report presents the canonical Hamiltonian formulation of relative satellite motion. The unperturbed Hamiltonian model is shown to be equivalent to the well-known Hill-Clohessy-Wiltshire (HCW) linear formulation. The influence of perturbations from the nonlinear gravitational potential and the oblateness of the Earth (J2 perturbations) is also modelled within the Hamiltonian formulation. The modelling incorporates eccentricity of the reference orbit. The corresponding Hamiltonian vector fields are computed and implemented in Simulink. A numerical method is presented aimed at locating periodic or quasi-periodic relative satellite motion. The numerical method outlined in this paper is applied to the Hamiltonian system. Although the orbits considered here are weakly unstable at best, in the case of eccentricity only, the method finds exact periodic orbits. When other perturbations such as nonlinear gravitational terms are added, drift is significantly reduced, and in the case of the J2 perturbation with and without the nonlinear gravitational potential term, bounded quasi-periodic solutions are found. Advantages of using Newton's method to search for periodic or quasi-periodic relative satellite motion include simplicity of implementation, repeatability of solutions due to its non-random nature, and fast convergence. Given that the use of bounded or drifting trajectories as control references carries practical difficulties over long-term missions, Principal Component Analysis (PCA) is applied to the quasi-periodic or slowly drifting trajectories to help provide a closed reference trajectory for the implementation of closed loop control. In order to evaluate the effect of the quality of the model used to generate the periodic reference trajectory, a study involving closed loop control of a simulated master/follower formation was performed. The results of the closed loop control study indicate that the quality of the model employed for generating the reference trajectory used for control purposes has an important influence on the resulting amount of fuel required to track the reference trajectory. The model used to generate LQR controller gains also has an effect on the efficiency of the controller.
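As an illustration of the periodic-orbit search on the unperturbed (HCW) part of the model only, a Newton sketch that drives the per-period along-track drift to zero by adjusting the single initial velocity component responsible for it (mean motion and initial offsets are hypothetical; the report's method treats the full perturbed Hamiltonian system):

```python
import numpy as np
from scipy.linalg import expm

n = 0.00113  # mean motion of the reference orbit [rad/s] (hypothetical LEO value)

# Hill-Clohessy-Wiltshire linear dynamics  ds/dt = A s,  s = [x, y, z, vx, vy, vz]
A = np.array([[0, 0, 0, 1, 0, 0],
              [0, 0, 0, 0, 1, 0],
              [0, 0, 0, 0, 0, 1],
              [3 * n**2, 0, 0, 0, 2 * n, 0],
              [0, 0, 0, -2 * n, 0, 0],
              [0, 0, -n**2, 0, 0, 0]])
T = 2.0 * np.pi / n
Phi = expm(A * T)   # state-transition matrix over one reference-orbit period

def drift(vy0, r0=np.array([100.0, 0.0, 50.0])):
    """Along-track drift y(T) - y(0) after one period, as a function of the
    initial along-track velocity vy0 (other initial velocities held at zero)."""
    s0 = np.array([r0[0], r0[1], r0[2], 0.0, vy0, 0.0])
    return (Phi @ s0 - s0)[1]

# Newton's method on the drift; derivative by finite differences.
vy0, h = 0.0, 1e-4
for _ in range(5):
    g = drift(vy0)
    dg = (drift(vy0 + h) - g) / h
    vy0 -= g / dg

print(f"periodic vy0 = {vy0:.6f} m/s (HCW no-drift condition -2*n*x0 = {-2*n*100.0:.6f})")
```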

Relevance: 30.00%

Abstract:

A class of identification algorithms is introduced for Gaussian process (GP) models. The fundamental approach is to propose a new kernel function which leads to a covariance matrix with low rank, a property that is consequently exploited for computational efficiency in both model parameter estimation and model predictions. The objectives of maximizing the marginal likelihood or the Kullback–Leibler (K–L) divergence between the estimated output probability density function (pdf) and the true pdf are used as the respective cost functions. For each cost function, an efficient coordinate descent algorithm is proposed to estimate the kernel parameters using a one-dimensional derivative-free search, and the noise variance using a fast gradient descent algorithm. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
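The computational gain from a low-rank covariance can be sketched generically: if K = Phi Phi^T + sigma^2 I with Phi of rank m much smaller than n, the Woodbury and matrix-determinant lemmas evaluate the marginal likelihood through an m-by-m system. The feature map below is a hypothetical stand-in for the proposed kernel:

```python
import numpy as np

def log_marginal_likelihood_lowrank(Phi, y, noise_var):
    """log p(y) for y ~ N(0, Phi Phi^T + noise_var I), computed in O(n m^2)
    via the Woodbury and matrix-determinant lemmas (m = rank << n)."""
    n, m = Phi.shape
    A = noise_var * np.eye(m) + Phi.T @ Phi            # m x m inner matrix
    L = np.linalg.cholesky(A)
    b = np.linalg.solve(L, Phi.T @ y)
    # y^T K^{-1} y  and  log|K|  without forming the n x n covariance
    quad = (y @ y - b @ b) / noise_var
    logdet = 2.0 * np.log(np.diag(L)).sum() + (n - m) * np.log(noise_var)
    return -0.5 * (quad + logdet + n * np.log(2.0 * np.pi))

rng = np.random.default_rng(0)
Phi = rng.normal(size=(500, 10))      # hypothetical rank-10 feature map
y = Phi @ rng.normal(size=10) + 0.1 * rng.normal(size=500)
print(log_marginal_likelihood_lowrank(Phi, y, noise_var=0.01))
```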

Relevance: 30.00%

Abstract:

A model based on graph isomorphisms is used to formalize software evolution. Step by step we narrow the search space by an informed selection of the attributes, based on the current state of the art in software engineering, and generate a seed solution. We then traverse the resulting space using graph isomorphisms and other set operations over the vertex sets. The new solutions will preserve the desired attributes. The goal of defining an isomorphism-based search mechanism is to construct predictors of evolution that can facilitate the automation of the 'software factory' paradigm. The model allows for automation via software tools implementing the concepts.
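A minimal sketch of the attribute-preserving isomorphism check, assuming hypothetical module-dependency graphs whose vertices carry an attribute selected in the seed-solution step; networkx's matcher restricts isomorphisms to those preserving that attribute:

```python
import networkx as nx
from networkx.algorithms import isomorphism

# Hypothetical seed solution: a small module-dependency graph whose vertices
# carry a selected attribute (here "role").
seed = nx.DiGraph()
seed.add_nodes_from([("ui", {"role": "presentation"}),
                     ("core", {"role": "logic"}),
                     ("db", {"role": "storage"})])
seed.add_edges_from([("ui", "core"), ("core", "db")])

# Candidate evolution step: renamed modules, same structure and attributes.
candidate = nx.relabel_nodes(seed, {"ui": "web", "core": "engine", "db": "store"})

# Keep only candidates that are isomorphic to the seed while preserving the
# selected vertex attribute.
node_match = isomorphism.categorical_node_match("role", None)
print(nx.is_isomorphic(seed, candidate, node_match=node_match))  # True
```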

Relevance: 30.00%

Abstract:

A model based on graph isomorphisms is used to formalize software evolution. Step by step we narrow the search space by an informed selection of the attributes, based on the current state of the art in software engineering, and generate a seed solution. We then traverse the resulting space using graph isomorphisms and other set operations over the vertex sets. The new solutions will preserve the desired attributes. The goal of defining an isomorphism-based search mechanism is to construct predictors of evolution that can facilitate the automation of the 'software factory' paradigm. The model allows for automation via software tools implementing the concepts.

Relevance: 30.00%

Abstract:

A new class of parameter estimation algorithms is introduced for Gaussian process regression (GPR) models. It is shown that the integration of the GPR model with two probability distance measures, (i) the integrated square error and (ii) the Kullback–Leibler (K–L) divergence, is analytically tractable. An efficient coordinate descent algorithm is proposed to iteratively estimate the kernel width using a golden section search, which includes a fast gradient descent algorithm as an inner loop to estimate the noise variance. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
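A minimal sketch of the coordinate-descent structure, using a standard squared-exponential GPR marginal likelihood as a stand-in for the probability-distance cost functions analysed in the paper: the kernel width is updated by a golden-section search and the noise variance by a few gradient steps (taken on its logarithm here for numerical stability):

```python
import numpy as np
from scipy.spatial.distance import cdist

GOLDEN = (np.sqrt(5.0) - 1.0) / 2.0   # inverse golden ratio, ~0.618

def golden_section_min(f, a, b, tol=1e-3):
    """Derivative-free golden-section search for a minimiser of f on [a, b]."""
    c, d = b - GOLDEN * (b - a), a + GOLDEN * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - GOLDEN * (b - a)
        else:
            a, c = c, d
            d = a + GOLDEN * (b - a)
    return 0.5 * (a + b)

def neg_log_marglik(X, y, width, noise_var):
    """Negative log marginal likelihood of GPR with a squared-exponential kernel."""
    K = np.exp(-cdist(X, X, "sqeuclidean") / (2.0 * width**2))
    L = np.linalg.cholesky(K + noise_var * np.eye(len(y)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)

def fit_coordinate_descent(X, y, width=1.0, noise_var=0.1, outer_iters=5, lr=0.1):
    """Alternate a golden-section update of the kernel width with a few
    gradient-descent steps on log(noise variance)."""
    for _ in range(outer_iters):
        width = golden_section_min(lambda w: neg_log_marglik(X, y, w, noise_var), 0.05, 10.0)
        for _ in range(10):
            K = np.exp(-cdist(X, X, "sqeuclidean") / (2.0 * width**2))
            Kinv = np.linalg.inv(K + noise_var * np.eye(len(y)))
            a = Kinv @ y
            grad = 0.5 * noise_var * (np.trace(Kinv) - a @ a)  # d(-log p)/d(log noise_var)
            noise_var *= np.exp(-np.clip(lr * grad, -1.0, 1.0))
    return width, noise_var

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
print(fit_coordinate_descent(X, y))
```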