817 results for Error of measurement


Relevance: 90.00%

Abstract:

An improved technique for 3D head tracking under varying illumination conditions is proposed. The head is modeled as a texture-mapped cylinder. Tracking is formulated as an image registration problem in the cylinder's texture map image. The resulting dynamic texture map provides a stabilized view of the face that can be used as input to many existing 2D techniques for face recognition, facial expression analysis, lip reading, and eye tracking. To solve the registration problem in the presence of lighting variation and head motion, the residual error of registration is modeled as a linear combination of texture warping templates and orthogonal illumination templates. Fast and stable on-line tracking is achieved via regularized, weighted least-squares minimization of the registration error. The regularization term tends to limit potential ambiguities that arise in the warping and illumination templates, enabling stable tracking over extended sequences. Tracking does not require a precise initial fit of the model; the system is initialized automatically using a simple 2D face detector. The only assumption is that the target is facing the camera in the first frame of the sequence. The formulation is tailored to take advantage of texture mapping hardware available in many workstations, PCs, and game consoles. The non-optimized implementation runs at about 15 frames per second on an SGI O2 graphics workstation. Extensive experiments evaluating the effectiveness of the formulation are reported. The sensitivity of the technique to illumination, regularization parameters, errors in initial positioning, and internal camera parameters is analyzed. Examples and applications of tracking are reported.
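The regularized, weighted least-squares step described above can be sketched with the usual normal equations. This is a toy illustration, not the authors' implementation: the two-template closed form, the weights, and the regularization value λ are assumptions made for the example.

```python
# Toy sketch of regularized, weighted least squares: fit residuals r as a
# linear combination of two template images B, with per-pixel weights W.
# Solves the normal equations (B^T W B + lam*I) q = B^T W r in closed form
# for the two-template case. Values below are invented for illustration.

def ridge_wls(B, W, r, lam):
    """B: list of (b1, b2) template values per pixel; W: weights; r: residuals.
    Returns the coefficient pair (q1, q2)."""
    a11 = sum(w * b1 * b1 for (b1, _), w in zip(B, W)) + lam
    a12 = sum(w * b1 * b2 for (b1, b2), w in zip(B, W))
    a22 = sum(w * b2 * b2 for (_, b2), w in zip(B, W)) + lam
    c1 = sum(w * b1 * ri for (b1, _), w, ri in zip(B, W, r))
    c2 = sum(w * b2 * ri for (_, b2), w, ri in zip(B, W, r))
    det = a11 * a22 - a12 * a12  # 2x2 Cramer's rule
    return ((a22 * c1 - a12 * c2) / det, (a11 * c2 - a12 * c1) / det)
```

With lam = 0 and noise-free data the true coefficients are recovered exactly; a small lam > 0 trades a little bias for stability when the templates are nearly ambiguous, which is the role the paper's regularization term plays.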

Relevance: 90.00%

Abstract:

We introduce a method for recovering the spatial and temporal alignment between two or more views of objects moving over a ground plane. Existing approaches either assume that the streams are globally synchronized, so that only the spatial alignment needs to be solved, or that the temporal misalignment is small enough for exhaustive search to be performed. In contrast, our approach can recover both the spatial and temporal alignment. We compute for each trajectory a number of interesting segments, and we use their descriptions to form putative matches between trajectories. Each pair of corresponding interesting segments induces a temporal alignment, and defines an interval of common support across two views of an object that is used to recover the spatial alignment. Interesting segments and their descriptors are defined using algebraic projective invariants measured along the trajectories. Similarity between interesting segments is computed by taking into account the statistics of such invariants. Candidate alignment parameters are verified by checking the consistency, in terms of the symmetric transfer error, of all the putative pairs of corresponding interesting segments. Experiments are conducted with two different sets of data: one with two views of an outdoor scene featuring moving people and cars, and one with four views of a laboratory sequence featuring moving radio-controlled cars.
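The way a pair of corresponding segments induces a temporal alignment can be sketched as a simple linear time map. This is an assumed model for illustration, not the authors' code: two matched timestamps per segment pin down the scale and offset between the two views' clocks.

```python
# Toy sketch (assumed model): corresponding "interesting segments" in two
# views induce a temporal alignment t' = a*t + b; the two matched endpoint
# timestamps of one segment pair determine a and b.

def temporal_alignment(t1, t2, t1p, t2p):
    """Recover (a, b) in t' = a*t + b from a segment spanning (t1, t2) in
    view 1 matched to (t1p, t2p) in view 2."""
    a = (t2p - t1p) / (t2 - t1)   # relative frame-rate / clock scale
    b = t1p - a * t1              # clock offset between the two views
    return a, b
```

Each putative segment match yields such a candidate (a, b), which is then verified against all other matches via the symmetric transfer error, as the abstract describes.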

Relevance: 90.00%

Abstract:

As a prominent form of land use across much of upland Europe, extensive livestock grazing may hold the key to the sustainable management of these landscapes. Recent agricultural policy reform, however, has resulted in a decline in upland sheep numbers, prompting concern for the biodiversity value of these areas. This study quantifies the effects of varying levels of grazing management on plant, ground beetle and breeding bird diversity and assemblage in the uplands and lowlands of hill sheep farms in County Kerry, Ireland. Farms represent a continuum of light to heavy grazing, measured using a series of field indicators across several habitats, such as the internationally important blanket bog, home to the ground beetle, Carabus clatratus. Linear mixed effects modelling and non-metric multidimensional scaling are employed to disentangle the most influential management and environmental factors. Grazing state may be determined by the presence of Molinia caerulea or Nardus stricta, and variables such as % traditional ewes, % vegetation litter and % scrub prove valuable indicators of diversity. Measures of ecosystem functioning, e.g. plant biomass (nutrient cycling) and % vegetation cover (erosion rates) are influenced by plant diversity, which is influenced by grazing management. Levels of the ecosystem service, soil organic carbon, vary with ground beetle abundance and diversity, potentially influencing carbon sequestration and thereby climate change. The majority of species from all three taxa are found in the lowlands, with the exception of birds such as meadow pipit and skylark. The scale of measurement should be determined by the size and mobility of the species in question. 
The challenge is to manage these high nature value landscapes using agri-environment schemes which enhance biodiversity by maintaining structural heterogeneity across a range of scales, altitudes and habitats whilst integrating the decisions of people living and working in these marginal areas.

Relevance: 90.00%

Abstract:

The macaque frontal eye field (FEF) is involved in the generation of saccadic eye movements and fixations. To better understand the role of the FEF, we reversibly inactivated a portion of it while a monkey made saccades and fixations in response to visual stimuli. Lidocaine was infused into one FEF and neural inactivation was monitored with a nearby microelectrode. We used two saccadic tasks. In the delay task, a target was presented and then extinguished, but the monkey was not allowed to make a saccade to its location until a cue to move was given. In the step task, the monkey was allowed to look at a target as soon as it appeared. During FEF inactivation, monkeys were severely impaired at making saccades to locations of extinguished contralateral targets in the delay task. They were similarly impaired at making saccades to locations of contralateral targets in the step task if the target was flashed for ≤ 100 ms, such that it was gone before the saccade was initiated. Deficits included increases in saccadic latency, increases in saccadic error, and increases in the frequency of trials in which a saccade was not made. We varied the initial fixation location and found that the impairment specifically affected contraversive saccades rather than affecting all saccades made into head-centered contralateral space. Monkeys were impaired only slightly at making saccades to contralateral targets in the step task if the target duration was 1000 ms, such that the target was present during the saccade: latency increased, but increases in saccadic error were mild and increases in the frequency of trials in which a saccade was not made were insignificant. During FEF inactivation there usually was a direct correlation between the latency and the error of saccades made in response to contralateral targets. In the delay task, FEF inactivation increased the frequency of making premature saccades to ipsilateral targets.
FEF inactivation had inconsistent and mild effects on saccadic peak velocity. FEF inactivation caused impairments in the ability to fixate lights steadily in contralateral space. FEF inactivation always caused an ipsiversive deviation of the eyes in darkness. In summary, our results suggest that the FEF plays major roles in (1) generating contraversive saccades to locations of extinguished or flashed targets, (2) maintaining contralateral fixations, and (3) suppressing inappropriate ipsiversive saccades.

Relevance: 90.00%

Abstract:

Background: Hepatorenal tyrosinaemia (Tyr 1) is a rare inborn error of tyrosine metabolism. Without treatment, patients are at high risk of developing acute liver failure, renal dysfunction and, in the long run, hepatocellular carcinoma. The aim of our study was to collect cross-sectional data. Methods: Via questionnaires we collected retrospective data on 168 patients with Tyr 1 from 21 centres (Europe, Turkey and Israel) regarding diagnosis, treatment, monitoring and outcome. In a subsequent consensus workshop, we discussed the data and their clinical implications. Results: Early treatment with NTBC accompanied by diet is essential to prevent serious complications such as liver failure, hepatocellular carcinoma and renal disease. As patients may remain initially asymptomatic or develop uncharacteristic clinical symptoms in the first months of life, newborn mass screening using succinylacetone (SA) as a screening parameter in dried blood is mandatory for early diagnosis. NTBC treatment has to be combined with natural protein restriction supplemented with essential amino acids. NTBC dosage should be reduced to the minimal dose allowing metabolic control; once-daily dosing may be an option in older children and adults in order to increase compliance. Metabolic control is judged by SA (below the detection limit) in dried blood or urine, plasma tyrosine (<400 μM) and NTBC levels in the therapeutic range (20-40 μM). Side effects of NTBC are mild and often transient. Indications for liver transplantation are hepatocellular carcinoma or failure to respond to NTBC. Follow-up procedures should include liver and kidney function tests, tumor markers and imaging, ophthalmological examination, blood count, psychomotor and intelligence testing, as well as therapeutic monitoring (SA, tyrosine, NTBC in blood). Conclusion: Based on the data from 21 centres treating 168 patients, we were able to characterize current practice and clinical experience in Tyr 1. This information could form the basis for clinical practice recommendations; however, further prospective data are required to underpin some of the recommendations.

Relevance: 90.00%

Abstract:

We present a new way of extracting policy positions from political texts that treats texts not as discourses to be understood and interpreted but rather as data in the form of words. We compare this approach to previous methods of text analysis and use it to replicate published estimates of the policy positions of political parties in Britain and Ireland, on both economic and social policy dimensions. We “export” the method to a non-English-language environment, analyzing the policy positions of German parties, including the PDS as it entered the former West German party system. Finally, we extend its application beyond the analysis of party manifestos, to the estimation of political positions from legislative speeches. Our “language-blind” word scoring technique successfully replicates published policy estimates without the substantial costs of time and labor that these require. Furthermore, unlike any previous method for extracting policy positions from political texts, ours provides uncertainty measures for the estimates, allowing analysts to make informed judgments of the extent to which differences between two estimated policy positions can be viewed as significant or merely as products of measurement error.
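The core of a "language-blind" word-scoring scheme of this kind can be sketched in a few lines. This is a deliberately simplified illustration in the spirit of the method, not the published algorithm: reference texts with known positions score each word, and a new text is scored by the frequency-weighted mean of its word scores.

```python
# Hypothetical simplification of a word-scoring approach: words are data,
# not discourse. Reference texts with known policy positions assign each
# word a score; unseen ("virgin") texts are placed by averaging word scores.
from collections import Counter

def word_scores(ref_texts, ref_positions):
    """Score each word by the positions of the reference texts, weighted by
    how strongly the word is associated with each reference."""
    probs = []
    for text in ref_texts:
        counts = Counter(text.split())
        n = sum(counts.values())
        probs.append({w: k / n for w, k in counts.items()})
    vocab = set().union(*(p.keys() for p in probs))
    scores = {}
    for w in vocab:
        ps = [p.get(w, 0.0) for p in probs]
        total = sum(ps)
        scores[w] = sum(pos * p / total for pos, p in zip(ref_positions, ps))
    return scores

def score_text(text, scores):
    """Position of a new text: relative-frequency-weighted mean word score."""
    counts = Counter(w for w in text.split() if w in scores)
    n = sum(counts.values())
    return sum(scores[w] * k / n for w, k in counts.items())
```

A word used only by the left-reference text gets that text's position; a text mixing both vocabularies lands in between. Uncertainty measures, a key contribution of the paper, are omitted from this sketch.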

Relevance: 90.00%

Abstract:

The University of Waikato, Hamilton, New Zealand and The Queen's University of Belfast, Northern Ireland radiocarbon dating laboratories have undertaken a series of high-precision measurements on decadal samples of dendrochronologically dated oak (Quercus petraea) from Great Britain and cedar (Libocedrus bidwillii) and silver pine (Lagarostrobos colensoi) from New Zealand. The results show an average hemispheric offset over the 900 yr of measurement of 40±13 yr. This value is not constant but varies with a periodicity of about 130 yr. The Northern Hemisphere measurements confirm the validity of the Pearson et al. (1986) calibration dataset.

Relevance: 90.00%

Abstract:

The work presented is concerned with the estimation of manufacturing cost at the concept design stage, when little technical information is readily available. The work focuses on the nose cowl sections of a wide range of engine nacelles built at Bombardier Aerospace Shorts of Belfast. A core methodology is presented that: defines manufacturing cost elements that are prominent; utilises technical parameters that are highly influential in generating those costs; establishes the linkage between these two; and builds the associated cost estimating relations into models. The methodology is readily adapted to deal with both the early and more mature conceptual design phases, which thereby highlights the generic, flexible and fundamental nature of the method. The early concept cost model simplifies cost as a cumulative element that can be estimated using higher level complexity ratings, while the mature concept cost model breaks manufacturing cost down into a number of constituents that are each driven by their own specific drivers. Both methodologies have an average error of less than ten percent when correlated with actual findings, thus achieving an acceptable level of accuracy. By way of validity and application, the research is firmly based on industrial case studies and practice and addresses the integration of design and manufacture through cost. The main contribution of the paper is the cost modelling methodology. The elemental modelling of the cost breakdown structure through materials, part fabrication, assembly and their associated drivers is relevant to the analytical design procedure, as it utilises design definition and complexity that is understood by engineers.
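The early-concept model described above, in which cost accumulates over elements scaled by complexity ratings, can be sketched as follows. The structure and all numbers here are illustrative assumptions, not Bombardier's cost estimating relations.

```python
# Illustrative sketch only (hypothetical element names and numbers, not the
# paper's model): an early-concept estimate that accumulates cost elements,
# each a base cost scaled by a higher-level complexity rating.

def early_concept_cost(elements):
    """elements: list of (base_cost, complexity_rating) pairs.
    Returns the cumulative estimated manufacturing cost."""
    return sum(base * rating for base, rating in elements)

# Example: materials, fabrication and assembly elements with assumed ratings.
estimate = early_concept_cost([(100.0, 1.2), (50.0, 2.0)])
```

The mature-concept model in the paper goes further, driving each constituent by its own specific technical drivers rather than a single complexity rating.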

Relevance: 90.00%

Abstract:

For the purpose of a nonlocality test, we propose a general correlation observable of two parties by utilizing local d-outcome measurements with SU(d) transformations and classical communication. Generic symmetries of the SU(d) transformations and correlation observables are found for the test of nonlocality. It is shown that these symmetries dramatically reduce the number of numerical variables, which is important for numerical analysis of nonlocality. A linear combination of the correlation observables, which reduces to the Clauser-Horne-Shimony-Holt (CHSH) Bell inequality for two-outcome measurements, leads to the Collins-Gisin-Linden-Massar-Popescu (CGLMP) nonlocality test for d-outcome measurements. As a system to be tested for its nonlocality, we investigate a continuous-variable (CV) entangled state with d measurement outcomes. This allows the comparison of nonlocality based on different numbers of measurement outcomes on one physical system. In our example of the CV state, we find that a pure entangled state of any degree violates Bell's inequality for d (≥ 2) measurement outcomes when the observables are of SU(d) transformations.
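For the two-outcome (d = 2) case mentioned above, the CHSH combination can be evaluated numerically. The singlet correlation E(x, y) = -cos(x - y) is a standard quantum-mechanical result, not a value taken from this paper, and the angles below are the usual maximally violating choice.

```python
# Sketch of the CHSH Bell test (d = 2). For a singlet state, quantum
# mechanics predicts the correlation E(x, y) = -cos(x - y) for measurement
# angles x, y; local hidden-variable models require |S| <= 2.
from math import cos, pi, sqrt

def chsh(a, ap, b, bp):
    """CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b')."""
    E = lambda x, y: -cos(x - y)
    return E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)

# Standard angle choice giving the maximal quantum violation |S| = 2*sqrt(2).
S = chsh(0.0, pi / 2, pi / 4, -pi / 4)
```

The CGLMP test generalises this construction to d-outcome measurements, which is the regime the abstract studies.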

Relevance: 90.00%

Abstract:

Raman spectroscopy has been used for the first time to predict the FA composition of unextracted adipose tissue of pork, beef, lamb, and chicken. It was found that the bulk unsaturation parameters could be predicted successfully [R² = 0.97, root mean square error of prediction (RMSEP) = 4.6% of 4σ], with cis unsaturation, which accounted for the majority of the unsaturation, giving similar correlations. The combined abundance of all measured PUFA (≥ 2 double bonds per chain) was also well predicted with R² = 0.97 and RMSEP = 4.0% of 4σ. Trans unsaturation was not as well modeled (R² = 0.52, RMSEP = 18% of 4σ); this reduced prediction ability can be attributed to the low levels of trans FA found in adipose tissue (0.035 times the cis unsaturation level). For the individual FA, the average partial least squares (PLS) regression coefficient of the 18 most abundant FA (relative abundances ranging from 0.1 to 38.6% of the total FA content) was R² = 0.73; the average RMSEP = 11.9% of 4σ. Regression coefficients and prediction errors for the five most abundant FA were all better than the average value (in some cases as low as RMSEP = 4.7% of 4σ). Cross-correlation between the abundances of the minor FA and more abundant acids could be determined by principal component analysis methods, and the resulting groups of correlated compounds were also well predicted using PLS. The accuracy of the prediction of individual FA was at least as good as other spectroscopic methods, and the extremely straightforward sampling method meant that very rapid analysis of samples at ambient temperature was easily achieved. This work shows that Raman profiling of hundreds of samples per day is easily achievable with an automated sampling system.
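The "RMSEP as % of 4σ" figures quoted above follow from two generic formulas; the sketch below shows the computation with invented data values, not measurements from this study.

```python
# Hedged sketch: computing RMSEP, and RMSEP as a percentage of four
# standard deviations of the reference values (a common normalisation of
# prediction error). Generic formulas; the data used in tests are invented.
from math import sqrt

def rmsep(actual, predicted):
    """Root mean square error of prediction over a validation set."""
    n = len(actual)
    return sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

def rmsep_pct_of_4sigma(actual, predicted):
    """RMSEP expressed as a percentage of 4 * (population std dev of the
    reference values)."""
    mu = sum(actual) / len(actual)
    sigma = sqrt(sum((a - mu) ** 2 for a in actual) / len(actual))
    return 100.0 * rmsep(actual, predicted) / (4.0 * sigma)
```

Dividing by 4σ puts errors for analytes with very different natural ranges (bulk unsaturation vs. trace trans FA) on a comparable scale.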

Relevance: 90.00%

Abstract:

The results of a study aimed at determining the most important experimental parameters for automated, quantitative analysis of solid dosage form pharmaceuticals (seized and model 'ecstasy' tablets) are reported. Data obtained with a macro-Raman spectrometer were complemented by micro-Raman measurements, which gave information on particle size and provided excellent data for developing statistical models of the sampling errors associated with collecting data as a series of grid points on the tablets' surface. Spectra recorded at single points on the surface of seized MDMA-caffeine-lactose tablets with a Raman microscope (λex = 785 nm, 3 μm diameter spot) were typically dominated by one or other of the three components, consistent with Raman mapping data which showed the drug and caffeine microcrystals were ca 40 μm in diameter. Spectra collected with a microscope from eight points on a 200 μm grid were combined, and in the resultant spectra the average value of the Raman band intensity ratio used to quantify the MDMA:caffeine ratio, μr, was 1.19 with an unacceptably high standard deviation, σr, of 1.20. In contrast, with a conventional macro-Raman system (150 μm spot diameter), combined eight-grid-point data gave μr = 1.47 with σr = 0.16. A simple statistical model which could be used to predict σr under the various conditions used was developed. The model showed that the decrease in σr on moving to a 150 μm spot was too large to be due entirely to the increased spot diameter, but was consistent with the increased sampling volume that arose from a combination of the larger spot size and depth of focus in the macroscopic system. With the macro-Raman system, combining 64 grid points (0.5 mm spacing and 1-2 s accumulation per point) to give a single averaged spectrum for a tablet was found to be a practical balance between minimizing sampling errors and keeping overhead times at an acceptable level. 
The effectiveness of this sampling strategy was also tested by quantitative analysis of a set of model ecstasy tablets prepared from MDEA-sorbitol (0-30% by mass MDEA). A simple univariate calibration model of averaged 64-point data had R² = 0.998 and an r.m.s. standard error of prediction of 1.1%, whereas data obtained by sampling just four points on the same tablet showed deviations from the calibration of up to 5%.
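The qualitative behaviour of the sampling-error model, where the spread of the averaged band ratio shrinks as more grid points are combined, can be illustrated with a toy Monte Carlo simulation. The noise model (independent Gaussian point-to-point variation) and all numbers are assumptions for the sketch, not the authors' statistics.

```python
# Toy Monte Carlo sketch (assumed noise model): averaging a band-intensity
# ratio over more grid points shrinks its spread roughly as 1/sqrt(N),
# mirroring the drop in the ratio's standard deviation when 8 or 64 points
# are combined into one spectrum.
import random
import statistics

def sd_of_grid_mean(n_points, n_trials=2000, seed=1):
    """Std dev of the mean simulated ratio over n_points grid spectra,
    estimated from n_trials repeated tablets."""
    rng = random.Random(seed)
    means = [
        statistics.fmean(rng.gauss(1.2, 1.0) for _ in range(n_points))
        for _ in range(n_trials)
    ]
    return statistics.stdev(means)
```

In the real system the improvement from the 150 μm spot exceeded this simple averaging effect, which the authors attribute to the larger sampling volume.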

Relevance: 90.00%

Abstract:

The potential of Raman spectroscopy for the determination of meat quality attributes has been investigated using data from a set of 52 cooked beef samples, which were rated by trained taste panels. The Raman spectra, shear force and cooking loss were measured, and PLS was used to correlate the attributes with the Raman data. Good correlations and standard errors of prediction were found when the Raman data were used to predict the panels' rating of acceptability of texture (R² = 0.71, residual mean standard error of prediction (RMSEP) as % of the mean (μ) = 15%), degree of tenderness (R² = 0.65, RMSEP% of μ = 18%), degree of juiciness (R² = 0.62, RMSEP% of μ = 16%), and overall acceptability (R² = 0.67, RMSEP% of μ = 11%). In contrast, the mechanically determined shear force was poorly correlated with tenderness (R² = 0.15). Tentative interpretation of the plots of the regression coefficients suggests that the α-helix to β-sheet ratio of the proteins and the hydrophobicity of the myofibrillar environment are important factors contributing to the shear force, tenderness, texture and overall acceptability of the beef. In summary, this work demonstrates that Raman spectroscopy can be used to predict consumer-perceived beef quality. In part, this overall success is due to the fact that the Raman method predicts texture and tenderness, which are the predominant factors in determining overall acceptability in the Western world. Nonetheless, it is clear that Raman spectroscopy has considerable potential as a method for non-destructive and rapid determination of beef quality parameters.

Relevance: 90.00%

Abstract:

Extending the work presented in Prasad et al. (IEEE Proceedings on Control Theory and Applications, 147, 523-37, 2000), this paper reports a hierarchical nonlinear physical model-based control strategy to account for the problems arising due to complex dynamics of drum level and governor valve, and demonstrates its effectiveness in plant-wide disturbance handling. The strategy incorporates a two-level control structure consisting of lower-level conventional PI regulators and a higher-level nonlinear physical model predictive controller (NPMPC) for mainly set-point manoeuvring. The lower-level PI loops help stabilise the unstable drum-boiler dynamics and allow faster governor valve action for power and grid-frequency regulation. The higher-level NPMPC provides an optimal load demand (or set-point) transition by effective handling of plant-wide interactions and system disturbances. The strategy has been tested in a simulation of a 200-MW oil-fired power plant at Ballylumford in Northern Ireland. A novel approach is devised to test the disturbance rejection capability in severe operating conditions. Low-frequency disturbances were created by making random changes in radiation heat flow on the boiler side, while condenser vacuum was fluctuating in a random fashion on the turbine side. In order to simulate high-frequency disturbances, pulse-type load disturbances were made to strike at instants which are not an integral multiple of the NPMPC sampling period. Impressive results have been obtained during both types of system disturbances and extremely high rates of load change, right across the operating range. These results compared favourably with those from a conventional state-space generalized predictive control (GPC) method designed under similar conditions.
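The lower-level loops in such a hierarchy are ordinary PI regulators. The sketch below shows a minimal discrete PI regulator driving a toy first-order plant; the gains, time step, and plant model are illustrative assumptions, not parameters of the Ballylumford simulation.

```python
# Minimal discrete PI regulator sketch (illustrative gains and plant model,
# not the Ballylumford plant): the kind of lower-level loop used here to
# stabilise drum-boiler dynamics under a higher-level predictive controller.

class PI:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt          # accumulate integral action
        return self.kp * error + self.ki * self.integral

def run(steps=200, dt=0.1):
    """Drive a toy first-order plant x' = u - x toward a set-point of 1.0."""
    ctrl, x = PI(kp=2.0, ki=1.0, dt=dt), 0.0
    for _ in range(steps):
        u = ctrl.step(1.0, x)
        x += dt * (u - x)                          # forward-Euler plant update
    return x
```

The integral term removes the steady-state offset a pure proportional loop would leave; the NPMPC layer then moves the set-points these loops track.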

Relevance: 90.00%

Abstract:

Index properties such as the liquid limit and plastic limit are widely used to evaluate certain geotechnical parameters of fine-grained soils. Measurement of the liquid limit is a mechanical process, and the possibility of errors occurring during measurement is not significant. However, this is not the case for plastic limit testing, despite the fact that the current method of measurement is embraced by many standards around the world. The method in question relies on a fairly crude procedure known widely as the ‘thread rolling’ test, though it has been the subject of much criticism in recent years. It is essential that a new, more reliable method of measuring the plastic limit be developed using a mechanical process that is both consistent and easily reproducible. The work reported in this paper concerns the development of a new device to measure the plastic limit, based on the existing falling cone apparatus. The force required for the test is equivalent to the application of a 54 N fast-static load acting on the existing cone used in liquid limit measurements. The test is complete when the relevant water content of the soil specimen allows the cone to achieve a penetration of 20 mm. The new technique was used to measure the plastic limit of 16 different clays from around the world. The plastic limit measured using the new method identified reasonably well the water content at which the soil phase changes from the plastic to the semi-solid state. Further evaluation was undertaken by conducting plastic limit tests using the new method on selected samples and comparing the results with values reported by local site investigation laboratories. Again, reasonable agreement was found.