943 results for “technical error of measurement”


Relevance: 100.00%

Abstract:

The concept of a “true” ground-truth map is introduced, from which the inaccuracy/error of any production map may be measured. A partition of the mapped region is defined in terms of the “residual rectification” transformation. Geometric RMS-type and Geometric Distortion error criteria are defined, as well as a map mis-classification error criterion (the latter for hard and fuzzy production maps). The total map error is defined to be the sum (over each set of the map partition mentioned above) of these three error components integrated over each set of the partition.
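
The decomposition described above can be written compactly; the following LaTeX sketch uses symbols (partition sets A_i and error densities E_RMS, E_dist, E_class) chosen here for illustration rather than taken from the paper:

```latex
E_{\mathrm{total}}
  \;=\; \sum_{i=1}^{n} \int_{A_i}
        \Big( E_{\mathrm{RMS}}(x) + E_{\mathrm{dist}}(x) + E_{\mathrm{class}}(x) \Big)\, dA ,
```

where \(A_1, \dots, A_n\) are the sets of the map partition induced by the residual-rectification transformation.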

Relevance: 100.00%

Abstract:

Investigations of Li-7(p,n)Be-7 reactions using Cu and CH primary and LiF secondary targets were performed using the VULCAN laser [C.N. Danson, J. Mod. Opt. 45, 1653 (1997)] at intensities up to 3 × 10¹⁹ W cm⁻². The neutron yield was measured using a CR-39 plastic track detector and was up to 3 × 10⁸ sr⁻¹ for CH primary targets and up to 2 × 10⁸ sr⁻¹ for Cu primary targets. The angular distribution of neutrons was measured at various angles and revealed an anisotropy in the neutron distribution over 180° that was greater than the error of measurement. It may be possible to exploit such reactions on high-repetition-rate, table-top lasers for neutron radiography. (C) 2004 American Institute of Physics.

Relevance: 100.00%

Abstract:

Turbocompounding is the process of recovering a proportion of an engine’s fuel energy that would otherwise be lost in the exhaust process and adding it to the output power. This was first seen in the 1930s and is carried out by coupling an exhaust gas turbine to the crankshaft of a reciprocating engine. It has since been recognised that coupling the power turbine to an electrical generator instead of the crankshaft has the potential to reduce the fuel consumption further with the added flexibility of being able to decide how this recovered energy is used. The electricity generated can be used in automotive applications to assist the crankshaft using a flywheel motor generator or to power ancillaries that would otherwise have run off the crankshaft. In the case of stationary power plants, it can assist the electrical power output. Decoupling the power turbine from the crankshaft and coupling it to a generator allows the power electronics to control the turbine speed independently in order to optimise the specific fuel consumption for different engine operating conditions. This method of energy recapture is termed ‘turbogenerating’.

This paper gives a brief history of turbocompounding and its thermodynamic merits. It then gives an account of the validation of a turbogenerated engine model, which is used to investigate what needs to be done to an engine when a turbogenerator is installed. The engine being modelled is used for stationary power generation and is fuelled by an induced biogas, with a small portion of palm oil injected into the cylinder to initiate combustion by compression ignition. From these investigations, optimum settings were found that result in a 10.90% improvement in overall efficiency. These savings are relative to the same engine operating with fixed fuelling and without a turbogenerator installed.
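
As a rough illustration of why adding recovered electrical power raises overall efficiency, the sketch below compares brake-only and turbogenerated efficiency; the figures and variable names are assumptions chosen only to show the arithmetic, not values from the paper, and the sketch ignores secondary effects such as added exhaust back-pressure.

```python
# Illustrative overall-efficiency comparison for a turbogenerated engine.
# All figures below are hypothetical.

fuel_power_kw = 1000.0         # rate of fuel energy supplied to the engine
brake_power_kw = 400.0         # shaft power without a turbogenerator
turbogen_electrical_kw = 45.0  # net electrical power recovered by the power turbine

eta_baseline = brake_power_kw / fuel_power_kw
eta_turbogen = (brake_power_kw + turbogen_electrical_kw) / fuel_power_kw

print(f"baseline efficiency:       {eta_baseline:.1%}")
print(f"turbogenerated efficiency: {eta_turbogen:.1%}")
print(f"relative improvement:      {(eta_turbogen / eta_baseline - 1):.1%}")
```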

Relevance: 100.00%

Abstract:

Policy-based network management (PBNM) paradigms provide an effective tool for end-to-end resource management in converged next-generation networks by enabling unified, adaptive and scalable solutions that integrate and co-ordinate the diverse resource management mechanisms associated with heterogeneous access technologies. In our project, a PBNM framework for end-to-end QoS management in converged networks is being developed. The framework consists of distributed functional entities managed within a policy-based infrastructure to provide QoS and resource management in converged networks. Within any QoS control framework, an effective admission control scheme is essential for maintaining the QoS of flows present in the network. Measurement-based admission control (MBAC) and parameter-based admission control (PBAC) are two commonly used approaches. This paper presents the implementation and analysis of various measurement-based admission control schemes developed within a Java-based prototype of our policy-based framework. The evaluation is made with real traffic flows on a Linux-based experimental testbed where the current prototype is deployed. Our results show that, unlike classic MBAC-only or PBAC-only schemes, a hybrid approach that combines both methods can simultaneously improve admission control and network utilisation efficiency.
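
The abstract does not spell out how the hybrid scheme combines the two tests; the sketch below shows one plausible combination, in which a flow is admitted only if it passes both a parameter-based check (declared peak rates) and a measurement-based check (smoothed observed load). The class, thresholds, smoothing factor and link capacity are illustrative assumptions, not the project's actual design.

```python
from dataclasses import dataclass, field

@dataclass
class HybridAdmissionController:
    """Toy hybrid MBAC/PBAC admission controller (illustrative sketch)."""
    capacity_mbps: float                  # link capacity (assumed known)
    utilisation_target: float = 0.9       # MBAC headroom factor (assumption)
    measured_load_mbps: float = 0.0       # smoothed measured aggregate load
    declared_peaks_mbps: list = field(default_factory=list)
    alpha: float = 0.3                    # EWMA smoothing factor (assumption)

    def update_measurement(self, sample_mbps: float) -> None:
        # Exponentially weighted moving average of the measured load.
        self.measured_load_mbps = (self.alpha * sample_mbps
                                   + (1 - self.alpha) * self.measured_load_mbps)

    def admit(self, declared_peak_mbps: float) -> bool:
        # PBAC test: worst case based on declared peak rates of admitted flows.
        pbac_ok = (sum(self.declared_peaks_mbps) + declared_peak_mbps
                   <= self.capacity_mbps)
        # MBAC test: measured load plus the new flow stays under the target.
        mbac_ok = (self.measured_load_mbps + declared_peak_mbps
                   <= self.utilisation_target * self.capacity_mbps)
        if pbac_ok and mbac_ok:
            self.declared_peaks_mbps.append(declared_peak_mbps)
            return True
        return False
```

Admitting on the conjunction of the two tests is one simple way a hybrid scheme could avoid both the over-conservatism of pure PBAC and the optimism of pure MBAC.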

Relevance: 100.00%

Abstract:

Background: Previous research demonstrates various associations between depression, cardiovascular disease (CVD) incidence and mortality, possibly as a result of the different methodologies used to measure depression and analyse relationships. This analysis investigated the association between depression, CVD incidence (CVDI) and mortality from CVD (MCVD), smoking related conditions (MSRC), and all causes (MALL), in a sample data set, where depression was measured using items from a validated questionnaire and using items derived from the factor analysis of a larger questionnaire, and analyses were conducted based on continuous data and grouped data.

Methods: Data from the PRIME Study (N=9798 men) on depression and 10-year CVD incidence and mortality were analysed using Cox proportional hazards models.
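
As a minimal sketch of the kind of survival analysis described (not the PRIME Study's actual code), a Cox proportional hazards model can be fitted with the lifelines library; the data frame, column names and quartile grouping below are assumptions chosen to mirror the continuous-versus-grouped comparison.

```python
import pandas as pd
from lifelines import CoxPHFitter

# df is assumed to hold one row per participant with:
#   followup_years - time to event or censoring
#   cvd_death      - 1 if the participant died of CVD, 0 otherwise
#   depression     - continuous depression score
#   age            - example covariate
df = pd.read_csv("prime_example.csv")  # hypothetical file name

# Continuous exposure: hazard ratio per unit increase in depression score.
cph_cont = CoxPHFitter()
cph_cont.fit(df[["followup_years", "cvd_death", "depression", "age"]],
             duration_col="followup_years", event_col="cvd_death")
cph_cont.print_summary()

# Grouped exposure: quartiles of the same score, entered as dummy variables.
df["dep_q"] = pd.qcut(df["depression"], 4, labels=False)
grouped = pd.get_dummies(df["dep_q"], prefix="dep_q", drop_first=True).astype(float)
cph_grp = CoxPHFitter()
cph_grp.fit(pd.concat([df[["followup_years", "cvd_death", "age"]], grouped], axis=1),
            duration_col="followup_years", event_col="cvd_death")
cph_grp.print_summary()
```

Comparing the two summaries illustrates how the same exposure can yield different conclusions depending on whether it is entered as a continuous score or as groups.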

Results: Using continuous data, both measures of depression resulted in the emergence of positive associations between depression and mortality (MCVD, MSRC, MALL). Using grouped data, however, associations between a validated measure of depression and MCVD, and between a measure of depression derived from factor analysis and all measures of mortality were lost.

Limitations: Low levels of depression, low numbers of individuals with high depression and low numbers of outcome events may limit these analyses, but these levels are typical of the population studied.

Conclusions: These data demonstrate a possible association between depression and mortality but detecting this association is dependent on the measurement used and method of analysis. Different findings based on methodology present clear problems for the elucidation and determination of relationships. The differences here argue for the use of validated scales where possible and suggest against over-reduction via factor analysis and grouping.

Relevance: 100.00%

Abstract:

PURPOSE:

To determine the test-retest variability in perimetric, optic disc, and macular thickness parameters in a cohort of treated patients with established glaucoma.

PATIENTS AND METHODS:

In this cohort study, the authors analyzed the imaging studies and visual field tests at the baseline and 6-month visits of 162 eyes of 162 participants in the Glaucoma Imaging Longitudinal Study (GILS). They assessed the difference, expressed as the standard error of measurement, of Humphrey field analyzer II (HFA) Swedish Interactive Threshold Algorithm fast, Heidelberg retinal tomograph (HRT) II, and retinal thickness analyzer (RTA) parameters between the two visits, and assumed that this difference was due to measurement variability rather than pathologic change. A statistically significant change was defined as twice the standard error of measurement.
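
The abstract does not state the exact formula; one common way to estimate the standard error of measurement from paired baseline and 6-month values, together with the 2 × SEM significant-change criterion, is sketched below with hypothetical mean-deviation values.

```python
import numpy as np

def sem_from_test_retest(baseline, followup):
    """SEM estimated from paired repeat measurements, assuming no true change.

    A common estimator is the standard deviation of the within-subject
    differences divided by sqrt(2).
    """
    diffs = np.asarray(followup, dtype=float) - np.asarray(baseline, dtype=float)
    return diffs.std(ddof=1) / np.sqrt(2)

# Hypothetical mean-deviation (MD) values in dB for a few eyes at two visits.
md_baseline = np.array([-2.1, -5.4, -3.3, -8.0, -1.7])
md_6months  = np.array([-2.6, -4.9, -3.9, -7.4, -2.2])

sem = sem_from_test_retest(md_baseline, md_6months)
significant_change = 2 * sem  # a change larger than this is unlikely to be noise
print(f"SEM = {sem:.2f} dB, significant change threshold = {significant_change:.2f} dB")
```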

RESULTS:

In this cohort of treated glaucoma patients, it was found that statistically significant changes were 3.2 dB for mean deviation (MD), 2.2 for pattern standard deviation (PSD), 0.12 for cup shape measure, 0.26 mm² for rim area, and 32.8 µm and 31.8 µm for superior and inferior macular thickness, respectively. On the basis of these values, it was estimated that the number of potential progression events detectable in this cohort by the parameters of MD, PSD, cup shape measure, rim area, superior macular thickness, and inferior macular thickness was 7.5, 6.0, 2.3, 5.7, 3.1, and 3.4, respectively.

CONCLUSIONS:

The variability of the measurements of MD, PSD, and rim area, relative to the range of possible values, is less than the variability of cup shape measure or macular thickness measurements. Therefore, the former measurements may be more useful global measurements for assessing progressive glaucoma damage.

Relevance: 100.00%

Abstract:

This paper is mainly concerned with the tracking accuracy of Exchange Traded Funds (ETFs) listed on the London Stock Exchange (LSE) but also evaluates their performance and pricing efficiency. The findings show that ETFs offer virtually the same return but exhibit higher volatility than their benchmark. It seems that the pricing efficiency, which should come from the creation and redemption process, does not fully hold, as equity ETFs show consistent price premiums. The tracking error of the funds is generally small and decreases over time. The risk of the ETF, daily price volatility and the total expense ratio explain a large part of the tracking error. Trading volume, fund size, bid-ask spread and average price premium or discount did not have an impact on the tracking error. Finally, it is concluded that market volatility and the tracking error are positively correlated.
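
Tracking error is not defined explicitly in the abstract; a common definition, sketched below with invented return series, is the standard deviation of the difference between the fund's and the benchmark's periodic returns, usually annualised.

```python
import numpy as np

def tracking_error(etf_returns, benchmark_returns, periods_per_year=252):
    """Annualised tracking error: std. dev. of the periodic return differences."""
    active = np.asarray(etf_returns, dtype=float) - np.asarray(benchmark_returns, dtype=float)
    return active.std(ddof=1) * np.sqrt(periods_per_year)

# Hypothetical daily returns for an ETF and its benchmark index.
etf   = np.array([0.0012, -0.0031, 0.0008, 0.0021, -0.0015])
index = np.array([0.0010, -0.0029, 0.0007, 0.0024, -0.0013])

print(f"Tracking error ≈ {tracking_error(etf, index):.4%} per year")
```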

Relevance: 100.00%

Abstract:

To date there is no documented procedure to extrapolate findings of an isometric nature to a whole-body performance setting. The purpose of this study was to quantify the reliability of perceived exertion to control neuromuscular output during an isometric contraction. Twenty-one varsity athletes completed a maximal voluntary contraction and a 2-minute constant-force contraction at both the start and end of the study. Between pre- and post-testing, all participants completed a 2-minute constant-perceived-exertion contraction once a day for 4 days. The intra-class correlation coefficient (R = 0.949) and standard error of measurement (SEM = 5.12 Nm) indicated that the isometric contraction was reliable. Limits of agreement demonstrated only moderate initial reliability, yet with narrower limits towards the end of the 4 training sessions. In conclusion, athletes naïve to a constant-effort isometric contraction will produce reliable and acceptably stable results after one familiarization session has been completed.
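
The abstract reports an ICC and an SEM side by side; one standard relationship between them is SEM = SD × √(1 − ICC), where SD is the between-subject standard deviation of the scores. The torque values below are made up for illustration and are not data from the study.

```python
import numpy as np

def sem_from_icc(scores, icc):
    """SEM = SD * sqrt(1 - ICC), with SD the between-subject standard deviation."""
    sd = np.asarray(scores, dtype=float).std(ddof=1)
    return sd * np.sqrt(1.0 - icc)

# Hypothetical peak-torque scores (Nm) for a small group of athletes.
torques = np.array([180.0, 225.0, 199.0, 240.0, 210.0, 188.0])
icc = 0.949  # reliability coefficient of the order reported in the abstract

print(f"SEM ≈ {sem_from_icc(torques, icc):.1f} Nm")
```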

Relevance: 100.00%

Abstract:

Objective: To determine overall, test–retest and inter-rater reliability of posture indices among persons with idiopathic scoliosis.

Design: A reliability study using two raters and two test sessions.

Setting: Tertiary care paediatric centre.

Participants: Seventy participants aged between 10 and 20 years with different types of idiopathic scoliosis (Cobb angle 15 to 60°) were recruited from the scoliosis clinic.

Main outcome measures: Based on the XY co-ordinates of natural reference points (e.g. the eyes) as well as markers placed on several anatomical landmarks, 32 angular and linear posture indices taken from digital photographs in the standing position were calculated with a specially developed software program. Generalisability theory served to estimate the reliability and standard error of measurement (SEM) for the overall, test–retest and inter-rater designs. Bland and Altman's method was also used to document agreement between sessions and raters.

Results: In the random design, dependability coefficients demonstrated a moderate level of reliability for six posture indices (ϕ = 0.51 to 0.72) and a good level of reliability for 26 posture indices out of 32 (ϕ ≥ 0.79). Error attributable to marker placement was negligible for most indices. Limits of agreement and SEM values were larger for shoulder protraction, trunk list, Q angle, cervical lordosis and scoliosis angles. The most reproducible indices were waist angles and knee valgus and varus.

Conclusions: Posture can be assessed in a global fashion from photographs in persons with idiopathic scoliosis. Despite the good reliability of marker placement, other studies are needed to minimise measurement errors in order to provide a suitable tool for monitoring change in posture over time.
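
As an illustration of the Bland and Altman agreement analysis mentioned above (not the study's own computation), the 95% limits of agreement between two sessions are the mean difference ± 1.96 standard deviations of the differences; the angle values below are invented.

```python
import numpy as np

def bland_altman_limits(session1, session2):
    """Mean difference (bias) and 95% limits of agreement between two sessions."""
    diffs = np.asarray(session2, dtype=float) - np.asarray(session1, dtype=float)
    bias = diffs.mean()
    sd = diffs.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical trunk-list angles (degrees) measured at two sessions.
s1 = np.array([2.1, 3.4, 1.8, 4.2, 2.9, 3.1])
s2 = np.array([2.4, 3.1, 2.2, 4.0, 3.3, 2.8])

bias, lower, upper = bland_altman_limits(s1, s2)
print(f"bias = {bias:.2f}°, limits of agreement = [{lower:.2f}°, {upper:.2f}°]")
```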

Relevance: 100.00%

Abstract:

This paper estimates a translog stochastic production function to examine the determinants of technical efficiency of freshwater prawn farming in Bangladesh. Primary data were collected by random sampling from 90 farmers in three villages in southwestern Bangladesh. Prawn farming displayed wide variability in technical efficiency, ranging from 9.50% to 99.94% with a mean of 65%, which suggests that a substantial 35% of potential output could be recovered by removing inefficiency. For a land-scarce country like Bangladesh, this gain could help increase income and ensure a better livelihood for farmers. Based on the translog production function specification, farmers could be made scale efficient by providing more inputs to produce more output. The results suggest that farmers’ education and non-farm income significantly improve efficiency, whilst farmers’ training, farm distance from the water canal and involvement in fish farm associations reduce efficiency. Hence, the study proposes strategies such as less involvement in farming-related associations and improving the effectiveness of farmer training facilities as beneficial adjustments for reducing inefficiency. Moreover, the key policy implication of the analysis is that investment in primary education would greatly improve technical efficiency.
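
The abstract names but does not write out the translog stochastic frontier; a generic specification of the kind usually estimated, with symbols chosen here for illustration, is:

```latex
\ln y_i \;=\; \beta_0 + \sum_{j} \beta_j \ln x_{ij}
         \;+\; \tfrac{1}{2} \sum_{j}\sum_{k} \beta_{jk} \ln x_{ij} \ln x_{ik}
         \;+\; v_i - u_i ,
\qquad
TE_i \;=\; \exp(-u_i),
```

where \(y_i\) is the output of farm \(i\), the \(x_{ij}\) are inputs, \(v_i\) is symmetric random noise and \(u_i \ge 0\) captures technical inefficiency, so that \(TE_i\) lies between 0 and 1.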