124 results for Decomposition techniques


Relevance:

20.00%

Publisher:

Abstract:

This study assesses the current state of adult skeletal age-at-death estimation in biological anthropology through analysis of data published in recent research articles from three major anthropological and archaeological journals (2004–2009). The most commonly used adult ageing methods, the age of ‘adulthood’, age ranges and the maximum age reported for ‘mature’ adults were compared. The results showed a wide range of variability in the age at which individuals were determined to be adult (from 14 to 25 years), uneven age ranges, a lack of standardisation in the use of descriptive age categories and the inappropriate application of some ageing methods for the sample being examined. Such discrepancies make comparisons between skeletal samples difficult, while the inappropriate use of some techniques makes the resultant age estimations unreliable. At a time when national and even global comparisons of past health are becoming prominent, standardisation in the terminology and age categories used to define adults within each sample is fundamental. It is hoped that this research will prompt discussions in the osteological community (both nationally and internationally) about what defines an ‘adult’, how to standardise the age ranges that we use and how individuals should be assigned to each age category. Skeletal markers have been proposed to help physically identify ‘adult’ individuals.

Relevance:

20.00%

Publisher:

Abstract:

In the last few years a state-space formulation has been introduced into self-tuning control. This has not only allowed for a wider choice of possible control actions, but has also provided insight into the theory underlying, and hidden by, the polynomial description. This paper considers many of the self-tuning algorithms, both state-space and polynomial, presently in use, and, starting from first principles, develops the observers that are effectively used in each case. At any specific time instant the state estimator can be regarded as taking one of two forms. In the first case the most recently available output measurement is excluded, and here an optimal and conditionally stable observer is obtained. In the second case the present output signal is included, and here it is shown that although the observer is once again conditionally stable, it is no longer optimal. This result is significant, as many of the popular self-tuning controllers lie in the second, rather than the first, category.
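
As an illustration of the two observer forms, the sketch below contrasts a predictor-form update, which excludes the most recent output measurement, with a current-estimator update, which includes it. The second-order plant and observer gain are arbitrary assumed values, not taken from the paper.

```python
import numpy as np

# Illustrative discrete-time plant and observer gain (assumed values).
A = np.array([[1.0, 0.1],
              [0.0, 0.9]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5],
              [0.8]])

def predictor_update(x_hat, u, y):
    """Predictor form: the estimate for step k+1 is corrected only with
    the output y(k), so the most recent measurement is excluded."""
    return A @ x_hat + B @ u + L @ (y - C @ x_hat)

def current_estimator_update(x_hat, u, y_next):
    """Current-estimator form: propagate first, then correct with the
    newest output y(k+1), so the present measurement is included."""
    x_pred = A @ x_hat + B @ u
    return x_pred + L @ (y_next - C @ x_pred)

# One step of each form from the same starting estimate.
x_hat = np.zeros((2, 1))
u = np.array([[1.0]])
y, y_next = np.array([[0.2]]), np.array([[0.3]])
print(predictor_update(x_hat, u, y).ravel())
print(current_estimator_update(x_hat, u, y_next).ravel())
```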

Relevance:

20.00%

Publisher:

Abstract:

The Earth-directed coronal mass ejection (CME) of 8 April 2010 provided an opportunity for space weather predictions from both established and developmental techniques to be made from near-real-time data received from the SOHO and STEREO spacecraft; the STEREO spacecraft provide a unique view of Earth-directed events from outside the Sun-Earth line. Although the near-real-time data transmitted by the STEREO Space Weather Beacon are significantly poorer in quality than the subsequently downlinked science data, their use has the advantage that near-real-time analysis is possible, allowing actual forecasts to be made. The fact that such forecasts cannot be biased by any prior knowledge of the actual arrival time at Earth provides an opportunity for an unbiased comparison between several established and developmental forecasting techniques. We conclude that for forecasts based on the STEREO coronagraph data it is important to take account of the subsequent acceleration or deceleration of each CME through interaction with the solar wind, while predictions based on measurements of CMEs made by the STEREO Heliospheric Imagers would benefit from higher temporal and spatial resolution. Space weather forecasting tools must work with near-real-time data; when such data are provided by science missions they are usually highly compressed and/or reduced in temporal and spatial resolution, and may also have significant gaps in coverage, making such forecasts more challenging.
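
One common way to account for the acceleration or deceleration of a CME through interaction with the solar wind is a drag-based model, in which the CME speed relaxes toward the ambient wind speed. The sketch below is purely illustrative and is not the forecasting method used in the study; the launch distance, wind speed and drag parameter are assumed values.

```python
# Drag-based CME propagation sketch: dv/dt = -gamma * |v - w| * (v - w),
# integrated from an assumed launch distance out to 1 AU.
AU_KM = 1.496e8

def arrival_time(r0_km, v0_kms, w_kms=400.0, gamma_per_km=1e-7, dt_s=600.0):
    """Integrate until the CME front reaches 1 AU; returns (hours, arrival speed)."""
    r, v, t = r0_km, v0_kms, 0.0
    while r < AU_KM:
        a = -gamma_per_km * abs(v - w_kms) * (v - w_kms)   # km/s^2
        v += a * dt_s
        r += v * dt_s
        t += dt_s
    return t / 3600.0, v

# Example: a 900 km/s CME launched at 20 solar radii into a 400 km/s wind.
hours, v_arr = arrival_time(r0_km=20 * 6.96e5, v0_kms=900.0)
print(f"Predicted transit time ~{hours:.1f} h, arrival speed ~{v_arr:.0f} km/s")
```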

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems based upon basis functions that are Bezier-Bernstein polynomial functions. The algorithm is generalized in that it copes with n-dimensional inputs by utilising an additive decomposition construction to overcome the curse of dimensionality associated with large n. The new construction algorithm also introduces univariate Bezier-Bernstein polynomial functions for the completeness of the generalized procedure. Like B-spline expansion based neurofuzzy systems, Bezier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as non-negativity of the basis functions, unity of support and interpretability of the basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and a Delaunay input-space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. The new modeling network is based on an additive decomposition approach together with two separate basis-function formation approaches for the univariate and bivariate Bezier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data-based modeling approach.
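
The sketch below illustrates the basic ingredients: univariate Bernstein polynomial basis functions (non-negative and summing to one) combined through an additive decomposition and fitted with conventional least squares. It is a minimal illustration of the general idea rather than the paper's full construction, which also covers bivariate bases and Delaunay input-space partitioning; the toy target function and polynomial degree are assumptions.

```python
import numpy as np
from math import comb

def bernstein_basis(x, degree):
    """Univariate Bernstein basis on [0, 1]:
    B_{i,n}(x) = C(n, i) * x**i * (1 - x)**(n - i).
    The functions are non-negative and sum to one, so they can be read
    as fuzzy membership functions."""
    x = np.asarray(x)
    return np.column_stack([comb(degree, i) * x**i * (1 - x)**(degree - i)
                            for i in range(degree + 1)])

def fit_additive_model(X, y, degree=3):
    """Additive decomposition: one univariate Bernstein expansion per input
    dimension, concatenated into a single regressor matrix and fitted with
    conventional least squares."""
    Phi = np.hstack([bernstein_basis(X[:, j], degree) for j in range(X.shape[1])])
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return weights

# Toy usage on an assumed two-input target (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2
w = fit_additive_model(X, y)
```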

Relevance:

20.00%

Publisher:

Abstract:

The precision of quasioptical null-balanced bridge instruments for transmission and reflection coefficient measurements at millimeter and submillimeter wavelengths is analyzed. A Jones matrix analysis is used to describe the amount of power reaching the detector as a function of grid angle orientation, sample transmittance/reflectance and phase delay. An analysis is performed of the errors involved in determining the complex transmission and reflection coefficients after taking into account the quantization error in the grid angle and micrometer readings, the transmission or reflection coefficient of the sample, the noise equivalent power of the detector, the source power and the post-detection bandwidth. For a system fitted with a rotating grid with a resolution of 0.017 rad and a micrometer quantization error of 1 μm, a 1 mW source, and a detector with a noise equivalent power of 5 × 10^−9 W Hz^−1/2, the maximum errors at an amplitude transmission or reflection coefficient of 0.5 are below ±0.025.
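
The sketch below gives a minimal Jones-matrix calculation of the kind described: the power reaching the detector as a function of the analysing-grid angle and the sample's complex amplitude transmittance. The optical chain used here (an isotropic sample followed by an ideal rotatable polarizer) is an assumed simplification, not the instrument's actual layout.

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer with its transmission
    axis rotated by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    P0 = np.array([[1.0, 0.0], [0.0, 0.0]])
    return R @ P0 @ R.T

def detected_power(theta, t_complex, E_in=np.array([1.0, 0.0])):
    """Power at the detector for an input field E_in passing through a
    sample of complex amplitude transmittance t_complex and then a
    rotatable analysing grid at angle theta (illustrative chain only)."""
    sample = t_complex * np.eye(2)          # assumed isotropic sample
    E_out = polarizer(theta) @ sample @ E_in
    return float(np.vdot(E_out, E_out).real)

# Example: amplitude transmission 0.5 with a 30 degree phase delay,
# analysing grid at 45 degrees.
print(detected_power(np.pi / 4, 0.5 * np.exp(1j * np.pi / 6)))
```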

Relevance:

20.00%

Publisher:

Abstract:

The use of n-tuple or weightless neural networks as pattern recognition devices has been well documented. They have significant advantages over more common network paradigms, such as the multilayer perceptron, in that they can be easily implemented in digital hardware using standard random access memories. To date, n-tuple networks have predominantly been used as fast pattern classification devices. The paper describes how n-tuple techniques can be used in the hardware implementation of a general auto-associative network.
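
A minimal software sketch of the n-tuple idea is shown below: each n-bit tuple of the input addresses a small lookup table (a RAM in a hardware realisation), training writes to the addressed locations, and recall counts how many tuples respond. It illustrates the classification use mentioned in the abstract; the paper's hardware auto-associative extension is not reproduced here.

```python
import random

class NTupleClassifier:
    """Minimal n-tuple (weightless) discriminator: one RAM per tuple,
    implemented here as a set of seen addresses (software sketch only)."""

    def __init__(self, input_bits, n=4, seed=0):
        rng = random.Random(seed)
        order = list(range(input_bits))
        rng.shuffle(order)
        # Partition the (randomly re-ordered) input bits into n-bit tuples.
        self.tuples = [order[i:i + n] for i in range(0, input_bits, n)]
        self.rams = [set() for _ in self.tuples]

    def _addresses(self, bits):
        for tup, ram in zip(self.tuples, self.rams):
            yield ram, tuple(bits[i] for i in tup)

    def train(self, bits):
        for ram, addr in self._addresses(bits):
            ram.add(addr)                 # write a 1 at this RAM address

    def score(self, bits):
        return sum(addr in ram for ram, addr in self._addresses(bits))

# Usage: train one discriminator per class, classify by the highest score.
clf = NTupleClassifier(input_bits=16)
clf.train([1, 0] * 8)
print(clf.score([1, 0] * 8), clf.score([0, 1] * 8))
```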

Relevance:

20.00%

Publisher:

Abstract:

A technique is derived for solving a non-linear optimal control problem by iterating on a sequence of simplified problems in linear quadratic form. The technique is designed to achieve the correct solution of the original non-linear optimal control problem in spite of these simplifications. A mixed approach with a discrete performance index and continuous state variable system description is used as the basis of the design, and it is shown how the global problem can be decomposed into local sub-system problems and a co-ordinator within a hierarchical framework. An analysis of the optimality and convergence properties of the algorithm is presented and the effectiveness of the technique is demonstrated using a simulation example with a non-separable performance index.
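
The sketch below illustrates the general idea of iterating on linear-quadratic approximations of a non-linear problem: linearise around the current trajectory, solve a Riccati-type backward recursion, update the controls, and repeat. The scalar plant, weights and horizon are assumed for illustration; the paper's hierarchical decomposition into sub-system problems and a co-ordinator is not reproduced.

```python
import numpy as np

def f(x, u):
    """Assumed scalar non-linear plant (illustrative only)."""
    return x + 0.1 * (-x**3 + u)

def f_x(x, u):
    return 1.0 - 0.3 * x**2      # partial derivative of f w.r.t. x

def f_u(x, u):
    return 0.1                   # partial derivative of f w.r.t. u

Q, R, N, x0 = 1.0, 0.1, 50, 2.0  # assumed weights, horizon and initial state
u = np.zeros(N)                  # initial guess for the control sequence
x = np.empty(N + 1)

for _ in range(20):              # iterate on simplified LQ problems
    # Forward pass: roll the non-linear plant out under the current controls.
    x[0] = x0
    for k in range(N):
        x[k + 1] = f(x[k], u[k])
    # Backward pass: Riccati-type recursion for the linearised LQ problem.
    P = Q
    gains = np.empty(N)
    for k in reversed(range(N)):
        A, B = f_x(x[k], u[k]), f_u(x[k], u[k])
        gains[k] = (B * P * A) / (R + B * P * B)
        P = Q + A * P * A - A * P * B * gains[k]
    # Apply the new linear feedback law to update the control sequence.
    for k in range(N):
        u[k] = -gains[k] * x[k]
        x[k + 1] = f(x[k], u[k])

print("final state after the iterations:", x[-1])
```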

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses the application of model reference adaptive control concepts to the automatic tuning of PID controllers. The gradient approach used for adaptation is described, and the effectiveness of the proposed method is shown through simulated examples.
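
The sketch below shows a gradient (MIT-rule) adaptation of a single feed-forward gain toward a reference model, as a minimal illustration of the model reference adaptive idea; the first-order plant, adaptation rate and command signal are assumed, and the paper's actual PID auto-tuning laws are not reproduced.

```python
import numpy as np

# Classic MIT-rule example: adapt a feed-forward gain theta so that the
# plant k/(s + a) driven by u = theta * uc tracks the model k0/(s + a).
dt, T = 0.01, 40.0
a, k, k0 = 1.0, 2.0, 1.0       # assumed plant and reference-model parameters
gamma = 0.5                    # adaptation rate
y = ym = theta = 0.0

for t in np.arange(0.0, T, dt):
    uc = 1.0 if (t % 20.0) < 10.0 else -1.0   # square-wave command
    u = theta * uc                             # adjustable feed-forward gain
    y += dt * (-a * y + k * u)                # plant: dy/dt = -a*y + k*u
    ym += dt * (-a * ym + k0 * uc)            # reference model
    e = y - ym
    theta += dt * (-gamma * e * ym)           # MIT rule: dtheta/dt = -gamma*e*(de/dtheta), with de/dtheta ~ ym

print(f"adapted gain theta ~ {theta:.3f} (ideal value k0/k = {k0 / k:.3f})")
```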

Relevance:

20.00%

Publisher:

Abstract:

In situ analysis has become increasingly important for contaminated land investigation and remediation. At present, portable techniques are used mainly as scanning tools to assess the spread and magnitude of the contamination, and are an adjunct to conventional laboratory analyses. A site in Cornwall, containing naturally occurring radioactive material (NORM), provided an opportunity for Reading University PhD student Anna Kutner to compare analytical data collected in situ with data generated by laboratory-based methods. The preliminary results in this paper extend the author's poster presentation at last September's GeoSpec2010 conference held in Lancaster.