93 results for Error of measurement


Relevance: 90.00%

Publisher:

Abstract:

Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph relating the error of prediction to the sample size is provided, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set with the maximum determinant of the variance-covariance matrix of predictions.
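The one-point-at-a-time augmentation described above can be sketched with a Gaussian-process emulator: refit, score the candidate set by prediction standard deviation, and add the most uncertain point. The toy simulator, kernel settings and candidate set below are illustrative stand-ins, not the fire model used in the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x[:, 0]) + x[:, 1] ** 2   # cheap stand-in for a slow simulator

X = rng.uniform(0, 1, size=(8, 2))                 # initial design
y = f(X)
candidates = rng.uniform(0, 1, size=(500, 2))      # candidate set for augmentation

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
for _ in range(5):                                 # augment one run at a time
    gp.fit(X, y)
    _, sd = gp.predict(candidates, return_std=True)
    best = np.argmax(sd)                           # point with maximum prediction variance
    X = np.vstack([X, candidates[best]])
    y = np.append(y, f(candidates[best:best + 1]))
```

The determinant-based alternative mentioned in the abstract would instead score small batches of candidates by the determinant of their joint predictive covariance.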


In Australia and, increasingly, worldwide, methamphetamine is one of the most commonly seized drugs analysed by forensic chemists. The current well-established GC/MS methods used to identify and quantify methamphetamine are lengthy, expensive processes, but rapid analysis is often requested by undercover police, leading to interest in developing a new analytical technique. Ninety-six illicit drug seizures containing methamphetamine (0.1-78.6%) were analysed using Fourier transform infrared spectroscopy with an attenuated total reflectance attachment (ATR-FTIR) combined with chemometrics. Two partial least squares (PLS) models were developed: one using the principal infrared peaks of methamphetamine, the other a hierarchical PLS model. Both models were refined to choose the variables most closely associated with the methamphetamine % vector. Both models performed well: the principal-peaks PLS model had a root mean square error of prediction (RMSEP) of 3.8, R2 of 0.9779 and a lower limit of quantification of 7% methamphetamine, while the hierarchical PLS model had a lower limit of quantification of 0.3% methamphetamine, RMSEP of 5.2 and R2 of 0.9637. Such models offer rapid and effective methods for screening illicit drug samples to determine the percentage of methamphetamine they contain.


Children are encountering more and more graphic representations of data in their learning and everyday life. Much of this data occurs in quantitative forms as different forms of measurement are incorporated into the graphics during their construction. In their formal education, children are required to learn to use a range of these quantitative representations in subjects across the school curriculum. Previous research that focuses on the use of information processing and traditional approaches to cognitive psychology concludes that the development of an understanding of such representations of data is a complex process. An alternative approach is to investigate the experiences of children as they interact with graphic representations of quantitative data in their own life-worlds. This paper demonstrates how a phenomenographic approach may be used to reveal the qualitatively different ways in which children in Australian primary and secondary education understand the phenomenon of graphic representations of quantitative data. Seven variations of the children’s understanding were revealed. These have been described interpretively in the article and confirmed through the words of the children. A detailed outcome space demonstrates how these seven variations are structurally related.


Meal-induced thermogenesis (MIT) research findings are highly inconsistent, in part due to the variety of durations and protocols used to measure MIT. We aimed to determine: 1) the proportion of a 6 h MIT response completed at 3, 4 and 5 h; 2) the associations between the shorter durations and the 6 h measure; and 3) whether shorter durations improved the reproducibility of the measurement. MIT was measured in response to a 2410 kJ mixed-composition meal in ten individuals (5 male, 5 female) on two occasions. Energy expenditure was measured continuously for 6 h post-meal using indirect calorimetry, and MIT was calculated as the increase in energy expenditure above the pre-meal resting metabolic rate (RMR). On average, 76%, 89% and 96% of the 6 h MIT response was completed within 3, 4 and 5 h respectively, and the MIT at each of these time points was strongly correlated with the 6 h MIT (range for correlations, r = 0.990 to 0.998; p < 0.01). The between-day CV for the 6 h measurement was 33%, but was significantly lower after 3 h of measurement (CV = 26%, p = 0.02). Despite variability in the total MIT between days, the proportion of the MIT that was complete at 3, 4 and 5 h was reproducible (mean CV: 5%). While 6 h is typically required to capture the complete MIT response, 3 h measures provide sufficient information about the magnitude of the response and may be applicable for measuring individuals on repeated occasions.
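The calculation of MIT as the cumulative energy expenditure above pre-meal RMR, and the fraction of the 6 h response completed at 3, 4 and 5 h, can be sketched as below. The exponentially decaying EE curve and all parameter values are synthetic stand-ins, not the study data.

```python
import numpy as np

t = np.arange(0, 361)                        # minutes post-meal, 1-min bins
rmr = 5.0                                    # pre-meal resting metabolic rate, kJ/min
ee = rmr + 1.5 * np.exp(-t / 120.0)          # synthetic EE decaying back toward RMR

def mit(upto_min):
    """MIT = energy expended above pre-meal RMR up to `upto_min` minutes."""
    mask = t <= upto_min
    return np.sum(ee[mask] - rmr)            # dt = 1 min, so the sum approximates the integral

total = mit(360)                             # full 6 h response
fractions = {h: mit(60 * h) / total for h in (3, 4, 5)}
```

With this illustrative curve the 3, 4 and 5 h fractions fall in the same qualitative range as the abstract's 76%, 89% and 96%, since most of the response is completed early.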


This thesis establishes performance properties for approximate filters and controllers that are designed on the basis of approximate dynamic system representations. These performance properties provide a theoretical justification for the widespread application of approximate filters and controllers in the common situation where system models are not known with complete certainty. This research also provides useful tools for approximate filter designs, which are applied to hybrid filtering of uncertain nonlinear systems. As a contribution towards applications, this thesis also investigates air traffic separation control in the presence of measurement uncertainties.


Lean body mass (LBM) and muscle mass remain difficult to quantify in large epidemiological studies due to the non-availability of inexpensive methods. We therefore developed anthropometric prediction equations to estimate LBM and appendicular lean soft tissue (ALST) using dual-energy X-ray absorptiometry (DXA) as the reference method. Healthy volunteers (n = 2220; 36% females; age 18-79 y) representing a wide range of body mass index (14-44 kg/m2) participated in this study. Their LBM, including ALST, was assessed by DXA along with anthropometric measurements. The sample was divided into prediction (60%) and validation (40%) sets. In the prediction set, a number of prediction models were constructed using the DXA-measured LBM and ALST estimates as dependent variables and combinations of anthropometric indices as independent variables. These equations were then cross-validated in the validation set. Simple equations using age, height and weight explained > 90% of the variation in LBM and ALST in both men and women. Additional variables (hip and limb circumferences and the sum of skinfold thicknesses) increased the explained variation by 5-8% in the fully adjusted models predicting LBM and ALST. More complex equations using all of the above anthropometric variables predicted the DXA-measured LBM and ALST accurately, as indicated by a low standard error of the estimate (LBM: 1.47 kg and 1.63 kg for men and women, respectively) and good agreement in Bland-Altman analyses. These equations could be a valuable tool in large epidemiological studies assessing these body compartments in Indians and other population groups with similar body composition.
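The simple age-height-weight equation and its validation-set standard error of the estimate can be sketched as below. The data are synthetic and the generating coefficients are invented; the study's equations were fit to actual DXA measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 2220
age = rng.uniform(18, 79, n)                       # years
height = rng.uniform(1.45, 1.95, n)                # metres
weight = rng.uniform(40, 120, n)                   # kg
# Synthetic "DXA" lean body mass; coefficients are illustrative only
lbm = 0.4 * weight + 20 * height - 0.05 * age + rng.normal(0, 1.5, n)

X = np.column_stack([age, height, weight])
split = int(0.6 * n)                               # 60% prediction, 40% validation
model = LinearRegression().fit(X[:split], lbm[:split])

resid = lbm[split:] - model.predict(X[split:])     # cross-validate on the held-out 40%
see = np.sqrt(np.mean(resid ** 2))                 # standard error of the estimate
r2 = model.score(X[split:], lbm[split:])           # explained variation
```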


The development of design guides to estimate the difference in speech interference level due to road traffic noise between a reference position and a balcony or façade position is explored. A previously established and validated theoretical model incorporating direct, specular and diffuse reflection paths is used to create a database of results across a large number of scenarios. Nine balcony types with variable acoustic treatments are assessed to provide acoustic design guidance on the optimised selection of balcony acoustic treatments based on location and street type. In total, the results database contains 9720 scenarios, on which multivariate linear regression is conducted to derive an appropriate design guide equation. The best-fit regression is a multivariable linear equation including modified exponential terms in each of nine deciding variables: (1) diffraction path difference, (2) ratio of total specular energy to direct energy, (3) distance loss between reference position and receiver position, (4) distance from source to balcony façade, (5) height of balcony floor above street, (6) balcony depth, (7) height of opposite buildings, (8) diffusion coefficient of buildings, and (9) balcony average absorption. Overall, the regression correlation coefficient R2 is 0.89, with a 95% confidence standard error of ±3.4 dB.


Personal ultraviolet dosimeters have been used in epidemiological studies to understand the risks and benefits of individuals' exposure to solar ultraviolet radiation (UVR). We investigated the types and determinants of non-compliance associated with a protocol for the use of polysulphone UVR dosimeters. In the AusD Study, 1,002 Australian adults (aged 18-75 years) were asked to wear a new dosimeter on their wrist each day for 10 consecutive days to quantify their daily exposure to solar UVR. Of the 10,020 dosimeters distributed, 296 (3%) were not returned or used (Type I non-compliance), and other usage errors were reported for 763 (8%) returned dosimeters (Type II non-compliance). Type I errors were more common in participants with predominantly outdoor occupations. Type II errors were reported more frequently on the first day of measurement, on weekend days or rainy days, and among females, younger people, more educated participants and those with outdoor occupations. Half (50%) of the participants reported a non-compliance error on at least one day during the 10-day period; however, 92% of participants had at least 7 days of usable data without any apparent non-compliance issues. The factors identified should be considered when designing future UVR dosimetry studies.


Recent scholarship has considered the implications of the rise of voluntary private standards in food and the role of private actors in a rapidly evolving, de facto ‘mandatory’ sphere of governance. Standards are an important element of this globalising private sphere, but one that has been relatively peripheral in analyses of power in agri-food systems. Sociological thought has countered orthodox views of standards as simple tools of measurement, instead understanding their function as a governance mechanism that transforms many things, and people, during processes of standardisation. In a case study of the Australian retail supermarket duopoly and the proprietary standards required for market access, this paper foregrounds retailers as standard owners and examines their role in third-party auditing and certification. Interview data from primary research into Australia’s food standards capture the multifaceted role supermarkets play as standard owners, who are found to impinge on the independence of third-party certification while enforcing rigorous audit practices. We show how standard owners, in attempting to standardise the audit process, generate tensions within certification practices in a unique example of ritualism around audit. In examining standards to understand power in contemporary food governance, it is shown that retailers are drawn beyond standard-setting into certification and enforcement, which is characterised by a web of institutions and actors whose power to influence outcomes is uneven.


Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP) conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) such emulators do not take advantage of our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept for developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state-space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, because it incorporates our knowledge of the dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by application to a simple hydrological model.
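The conditioning machinery mentioned above, a forward Kalman filter followed by a backward Rauch-Tung-Striebel smoothing pass, can be sketched for a scalar linear state-space model. The AR(1) dynamics and noise variances below are illustrative, not those of the hydrological application.

```python
import numpy as np

rng = np.random.default_rng(4)
a, q, r = 0.9, 0.1, 0.5            # AR(1) coefficient, process and observation noise (illustrative)
T = 100
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), T)          # noisy "design data"

# Forward Kalman filter: predicted and filtered means/variances
m = np.zeros(T); P = np.zeros(T)
m_pred = np.zeros(T); P_pred = np.zeros(T)
m_pred[0], P_pred[0] = 0.0, 1.0
for t in range(T):
    if t > 0:
        m_pred[t] = a * m[t - 1]
        P_pred[t] = a * a * P[t - 1] + q
    k = P_pred[t] / (P_pred[t] + r)           # Kalman gain
    m[t] = m_pred[t] + k * (y[t] - m_pred[t])
    P[t] = (1 - k) * P_pred[t]

# Backward Rauch-Tung-Striebel smoothing pass
ms = m.copy(); Ps = P.copy()
for t in range(T - 2, -1, -1):
    g = P[t] * a / P_pred[t + 1]
    ms[t] = m[t] + g * (ms[t + 1] - m_pred[t + 1])
    Ps[t] = P[t] + g * g * (Ps[t + 1] - P_pred[t + 1])
```

Smoothing uses future observations as well as past ones, so the smoothed variances `Ps` are never larger than the filtered variances `P`.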


Physical activity (PA) parenting research has proliferated over the past decade, with findings verifying the influential role that parents play in children's emerging PA behaviors. This knowledge, however, has not translated into effective family-based PA interventions. During a preconference workshop to the 2012 International Society for Behavioral Nutrition and Physical Activity annual meeting, a PA parenting workgroup met to: (1) discuss challenges in PA parenting research that may limit its translation, (2) identify explanations or reasons for such challenges, and (3) recommend strategies for future research. Challenges discussed by the workgroup included a proliferation of disconnected and inconsistently measured constructs, a limited understanding of the dimensions of PA parenting, and a narrow conceptualization of hypothesized moderators of the relationship between PA parenting and child PA. Potential reasons for such challenges emphasized by the group included a disinclination to employ theory when developing measures and examining predictors and outcomes of PA parenting, as well as a lack of agreed-upon measurement standards. Suggested solutions focused on the need to link PA parenting research with general parenting research, define and adopt rigorous standards of measurement, and identify new methods to assess PA parenting. As an initial step toward implementing these recommendations, the workgroup developed a conceptual model that: (1) integrates parenting dimensions from the general parenting literature into the conceptualization of PA parenting, (2) draws on behavioral and developmental theory, and (3) emphasizes areas which have been neglected to date, including precursors to PA parenting and effect modifiers.


For users of germplasm collections, the purpose of measuring characterization and evaluation descriptors, and subsequently using statistical methodology to summarize the data, is not only to interpret the relationships between the descriptors, but also to characterize the differences and similarities between accessions in relation to their phenotypic variability for each of the measured descriptors. The set of descriptors for the accessions of most germplasm collections consists of both numerical and categorical descriptors. This poses problems for a combined analysis of all descriptors because few statistical techniques deal with mixtures of measurement types. In this article, nonlinear principal component analysis was used to analyze the descriptors of the accessions in the Australian groundnut collection. It was demonstrated that the nonlinear variant of ordinary principal component analysis is an appropriate analytical tool because subspecies and botanical varieties could be identified on the basis of the analysis and characterized in terms of all descriptors. Moreover, outlying accessions could be easily spotted and their characteristics established. The statistical results and their interpretations provide users with a more efficient way to identify accessions of potential relevance for their plant improvement programs and encourage and improve the usefulness and utilization of germplasm collections.
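A crude stand-in for the article's nonlinear (optimal-scaling) principal component analysis is to standardise the numerical descriptors, one-hot encode the categorical ones, and run an ordinary PCA on the combined matrix. This is an approximation for illustration only, not the CATPCA-style method used in the article, and the descriptor data below are synthetic.

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
n = 200
num = rng.normal(size=(n, 3))                     # numerical descriptors (e.g. pod length)
cat = rng.choice(["a", "b", "c"], size=(n, 2))    # categorical descriptors (e.g. seed colour)

Z = np.hstack([
    StandardScaler().fit_transform(num),          # put numerical descriptors on one scale
    OneHotEncoder().fit_transform(cat).toarray(), # indicator coding for categories
])
scores = PCA(n_components=2).fit_transform(Z)     # 2-D ordination of accessions
```

A scatter plot of `scores` would then be inspected for clusters (subspecies, botanical varieties) and outlying accessions, as in the article.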


Self-reported health status measures are generally used to analyse Social Security Disability Insurance's (SSDI) application and award decisions as well as the relationship between its generosity and labour force participation. Due to endogeneity and measurement error, the use of self-reported health and disability indicators as explanatory variables in economic models is problematic. We employ county-level aggregate data, instrumental variables and spatial econometric techniques to analyse the determinants of variation in SSDI rates and explicitly account for the endogeneity and measurement error of the self-reported disability measure. Two surprising results are found. First, it is shown that measurement error is the dominating source of the bias and that the main source of measurement error is sampling error. Second, results suggest that there may be synergies for applying for SSDI when the disabled population is larger. © 2011 Taylor & Francis.
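The instrumental-variables logic can be sketched as a manual two-stage least squares on synthetic county-level data; the instrument, coefficients and noise structure are invented, and the paper's spatial econometric components are not shown.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 3000                                         # "counties"
z = rng.normal(size=n)                           # instrument, assumed exogenous
u = rng.normal(size=n)                           # unobserved confounder
disability = 0.8 * z + u + rng.normal(size=n)    # endogenous self-reported measure
ssdi = 1.0 + 0.5 * disability + u + rng.normal(size=n)   # true effect is 0.5

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X1 = np.column_stack([np.ones(n), z])
fitted = X1 @ ols(X1, disability)                # first stage: project onto the instrument
X2 = np.column_stack([np.ones(n), fitted])
beta_iv = ols(X2, ssdi)[1]                       # second stage: IV estimate, near 0.5

X_naive = np.column_stack([np.ones(n), disability])
beta_ols = ols(X_naive, ssdi)[1]                 # naive OLS, biased upward by the confounder
```

The gap between `beta_ols` and `beta_iv` illustrates why self-reported disability cannot simply be entered as a regressor.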


This paper examines the evaluation of BIM-enabled projects. It provides a critical review of the three main areas of measurement, namely technology, organization/people and process. Using two documented case studies of BIM implementation, the paper illustrates the benefits realized by project owners and contractors, and reveals a lack of attention to contextual factors affecting the adoption and deployment of BIM. The paper has three main contributions. First, it identifies and discusses the lack of, and difficulty surrounding, standardized assessment methods for evaluating BIM-enabled projects. Second, it proposes a conceptual model that includes contextual attributes and demonstrates how the proposed framework reaches beyond simple evaluation to encompass the documentation of BIM's benefits, lessons learned, challenges and adopted solutions. Third, it shows how the framework can account for existing business processes, organizational process assets, and enterprise-level factors. The paper aims to provide a conceptual basis for evaluation and a starting point for benchmarking.


This paper investigates compressed sensing using hidden Markov models (HMMs) and hence extends recent single-frame, bounded-error sparse decoding problems into a class of sparse estimation problems containing both temporal evolution and stochastic aspects. The paper presents two optimal estimators for compressed HMMs. The impact of measurement compression on HMM filtering performance is experimentally examined in the context of an important image-based aircraft target tracking application. Surprisingly, tracking of dim, small-sized targets (as small as 5-10 pixels, with local detectability/SNR as low as −1.05 dB) was only mildly affected by compressed sensing down to 15% of the original image size.
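The HMM filtering that underlies these estimators can be sketched with a two-state forward recursion on scalar measurements. The states, transition matrix and Gaussian emission model below are illustrative, and the compression stage is represented only by the fact that each observation is a single scalar.

```python
import numpy as np

rng = np.random.default_rng(7)
# Two-state HMM, e.g. target absent (0) / present (1) in an image region
A = np.array([[0.95, 0.05],
              [0.10, 0.90]])          # transition probabilities (illustrative)
means = np.array([0.0, 1.0])          # emission means of the compressed measurement
sigma = 0.8                           # measurement noise, chosen to mimic low SNR

T = 200
x = np.zeros(T, dtype=int)            # simulate the hidden chain
for t in range(1, T):
    x[t] = rng.choice(2, p=A[x[t - 1]])
y = means[x] + rng.normal(0, sigma, T)            # compressed scalar measurements

# Forward (filtering) recursion with per-step normalisation
alpha = np.array([0.5, 0.5])
est = np.zeros(T, dtype=int)
for t in range(T):
    lik = np.exp(-0.5 * ((y[t] - means) / sigma) ** 2)   # Gaussian emission likelihoods
    alpha = lik * (A.T @ alpha) if t > 0 else lik * alpha
    alpha /= alpha.sum()
    est[t] = np.argmax(alpha)         # MAP state estimate at time t
```

Even with heavily overlapping emission distributions, the temporal persistence encoded in `A` lets the filter recover the state sequence well above chance, which is the effect the paper exploits for dim targets.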