937 results for error correction model


Relevance: 30.00%

Abstract:

Background: Reliability or validity studies are important for the evaluation of measurement error in dietary assessment methods. An approach to validation known as the method of triads uses triangulation techniques to calculate the validity coefficient of a food-frequency questionnaire (FFQ).

Objective:
To assess the validity of FFQ estimates of carotenoid and vitamin E intake against serum biomarker measurements and weighed food records (WFRs), by applying the method of triads.

Design:
The study population was a sub-sample of adult participants in a randomised controlled trial of β-carotene and sunscreen in the prevention of skin cancer. Dietary intake was assessed by a self-administered FFQ and a WFR. Non-fasting blood samples were collected and plasma was analysed for five carotenoids (α-carotene, β-carotene, β-cryptoxanthin, lutein, lycopene) and vitamin E. Correlation coefficients were calculated between each pair of dietary methods, and the validity coefficient was calculated using the method of triads. The 95% confidence intervals for the validity coefficients were estimated using bootstrap sampling.
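The method of triads itself is simple to sketch: with three measurements of the same intake - FFQ (Q), biomarker (B) and weighed food record (W) - the validity coefficient of the FFQ is the square root of (r_QB x r_QW) / r_BW, and a percentile bootstrap gives the confidence interval. A minimal illustration in Python; the data and parameter values here are hypothetical, not from the study:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    if sx == 0 or sy == 0:
        return float("nan")  # degenerate (constant) sample
    return cov / (sx * sy)

def validity_coefficient(q, b, w):
    """Method of triads: validity coefficient of Q from the three pairwise
    correlations among FFQ (q), biomarker (b) and weighed record (w).
    Returns None when the ratio is negative and the square root is
    undefined (the beta-cryptoxanthin / vitamin E case in the abstract)."""
    r_qb, r_qw, r_bw = pearson(q, b), pearson(q, w), pearson(b, w)
    ratio = (r_qb * r_qw) / r_bw
    return math.sqrt(ratio) if ratio >= 0 else None

def bootstrap_ci(q, b, w, reps=1000, alpha=0.05, seed=42):
    """Percentile bootstrap CI for the validity coefficient."""
    rng = random.Random(seed)
    n, vcs = len(q), []
    for _ in range(reps):
        idx = [rng.randrange(n) for _ in range(n)]
        vc = validity_coefficient([q[i] for i in idx],
                                  [b[i] for i in idx],
                                  [w[i] for i in idx])
        if vc is not None:
            vcs.append(vc)
    vcs.sort()
    return vcs[int(alpha / 2 * len(vcs))], vcs[int((1 - alpha / 2) * len(vcs)) - 1]
```

Note that, as the abstract observes, the validity coefficient can exceed 1 or be undefined when one of the underlying correlations is weak or negative; the sketch returns None rather than a complex number in that case.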

Results:
The validity coefficients of the FFQ were highest for α-carotene (0.85) and lycopene (0.62), followed by β-carotene (0.55) and total carotenoids (0.55), while the lowest validity coefficient was for lutein (0.19). The method of triads could not be used for β-cryptoxanthin and vitamin E, as one of the three underlying correlations was negative.

Conclusions:
Results were similar to other studies of validity using biomarkers and the method of triads. For many dietary factors, the upper limit of the validity coefficients was less than 0.5 and therefore only strong relationships between dietary exposure and disease will be detected.

Relevance: 30.00%

Abstract:

The complexity of the forging process ensures that there is inherent variability in the geometric shape of a forged part. While knowledge of shape error - the difference between the desired and the measured shape - is important for assessing part quality, the question of more interest is what this error can suggest about the forging process set-up. The first contribution of this paper is a shape error metric that identifies geometric shape differences from the desired forged part, based on the point distribution deformable model developed in pattern recognition research. The second contribution is an inverse model that identifies changes in process set-up parameter values by analysing the proposed shape error metric. The metric and inverse models are developed using two sets of simulated hot-forged parts created with two different die pairs (a simple and an 'M'-shaped die pair). A neural network classifies the shape data into three arbitrarily chosen levels for each parameter; it is accurate to at least 77 per cent in the worst case for the simple die pair data and has an average accuracy of approximately 80 per cent when classifying the more complex 'M'-shaped die pair data.
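A point-distribution-style shape error can be sketched as follows. This is an illustrative reconstruction of the general technique, not the paper's implementation: landmark correspondence and alignment are assumed already done, and the low-dimensional mode weights are what a classifier (the paper uses a neural network) would map to process set-up parameter levels.

```python
import numpy as np

def shape_error(measured, desired):
    """Shape error as the vector of landmark deviations between a measured
    forged profile and the desired one; both inputs are (n_points, 2)
    arrays of corresponding, pre-aligned landmarks."""
    return (measured - desired).ravel()

def pdm_modes(training_errors, n_modes=2):
    """Principal modes of variation of a set of training shape errors
    (the point distribution model): leading eigenvectors of the error
    covariance matrix."""
    X = np.asarray(training_errors)
    X = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:n_modes]]

def mode_weights(error, modes):
    """Project one shape error onto the PDM modes, yielding the compact
    feature vector used for classification."""
    return error @ modes
```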

Relevance: 30.00%

Abstract:

Communication via email is one of the most popular services on the Internet, and email brings great convenience to our daily work and life. However, unsolicited messages, or spam, flood our mailboxes, wasting bandwidth, time and money. To this end, this paper presents a rough set based model that classifies emails into three categories - spam, non-spam and suspicious - rather than the two classes (spam and non-spam) used in most current approaches. Compared with popular classification methods such as naive Bayes, the proposed model reduces the error ratio of non-spam messages being misclassified as spam.
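The three-way decision that distinguishes this model from binary filters can be illustrated with a simple thresholded sketch: scores in the boundary region between the two cut-offs are deferred as "suspicious" rather than forced into either class, mirroring the positive, negative and boundary regions of rough set theory. The score and threshold values below are hypothetical, not the paper's rough set model:

```python
def classify_email(spam_score, lower=0.3, upper=0.7):
    """Three-way decision in the spirit of rough sets: scores at or above
    `upper` fall in the positive region (spam), scores at or below `lower`
    in the negative region (non-spam), and the boundary region in between
    is deferred as 'suspicious' for human review. Thresholds are
    illustrative."""
    if spam_score >= upper:
        return "spam"
    if spam_score <= lower:
        return "non-spam"
    return "suspicious"
```

Deferring the boundary region is precisely what lowers the cost of the worst error, a legitimate message discarded as spam, at the price of some messages needing manual review.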

Relevance: 30.00%

Abstract:

This project constructs a structural model of the United States economy. The task is tackled in two separate ways: first with econometric methods and then with a neural network, both structured to mimic the structure of the U.S. economy. The structural model tracks the performance of U.S. GDP rather well in a dynamic simulation, with an average error of just over 1 per cent. The neural network also performed well, but suffered from some theoretical as well as implementation issues.
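An "average error of just over 1 per cent" in a dynamic simulation is the kind of figure a mean absolute percentage error (MAPE) over the simulated series produces; a minimal sketch, with hypothetical GDP series rather than the project's data:

```python
def mape(actual, simulated):
    """Mean absolute percentage error (in per cent) between an actual
    series and its dynamically simulated counterpart."""
    return sum(abs((a - s) / a) for a, s in zip(actual, simulated)) / len(actual) * 100
```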

Relevance: 30.00%

Abstract:

The objective of our present paper is to derive a computationally efficient genetic pattern learning algorithm that evolutionarily derives the optimal rebalancing weights (i.e. dynamic hedge ratios) to engineer a structured financial product out of a multi-asset, best-of option. The stochastic target function is formulated as an expected squared cost of hedging (tracking) error, which is assumed to depend partly on the governing Markovian process underlying the individual asset returns and partly on randomness, i.e. pure white noise. A simple haploid genetic algorithm is advanced as an alternative numerical scheme, deemed computationally more efficient than numerically deriving an explicit solution to the formulated optimization model. An extension to the proposed scheme is suggested by means of adapting the genetic algorithm parameters with fuzzy logic controllers.
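A haploid GA of the kind described keeps a single chromosome of hedge weights per individual and evolves the population against the expected squared tracking error. The sketch below is an illustrative reconstruction under assumed operators (tournament selection, uniform crossover, Gaussian mutation, best-so-far retention); none of the operators or parameter values are taken from the paper:

```python
import random

def squared_hedging_error(weights, scenarios, targets):
    """Expected squared hedging (tracking) error of a weight vector over
    simulated return scenarios against the target payoff series."""
    total = 0.0
    for rets, target in zip(scenarios, targets):
        portfolio = sum(w * r for w, r in zip(weights, rets))
        total += (portfolio - target) ** 2
    return total / len(targets)

def haploid_ga(scenarios, targets, n_assets, pop_size=40, gens=50,
               mut_rate=0.2, seed=1):
    """Simple haploid GA over real-valued hedge-weight chromosomes:
    3-way tournament selection, uniform crossover, Gaussian mutation,
    retaining the best-so-far individual. Parameters are illustrative."""
    rng = random.Random(seed)
    fitness = lambda w: squared_hedging_error(w, scenarios, targets)
    population = [[rng.uniform(-1.0, 1.0) for _ in range(n_assets)]
                  for _ in range(pop_size)]
    best = min(population, key=fitness)
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            p1 = min(rng.sample(population, 3), key=fitness)
            p2 = min(rng.sample(population, 3), key=fitness)
            child = [x if rng.random() < 0.5 else y for x, y in zip(p1, p2)]
            if rng.random() < mut_rate:
                i = rng.randrange(n_assets)
                child[i] += rng.gauss(0.0, 0.1)
            nxt.append(child)
        population = nxt
        gen_best = min(population, key=fitness)
        if fitness(gen_best) < fitness(best):
            best = gen_best
    return best
```

Because the fitness function only needs forward evaluations of the tracking error, the GA sidesteps the explicit optimisation the paper deems more expensive; the fuzzy-logic extension mentioned would adapt `mut_rate` and similar parameters during the run.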

Relevance: 30.00%

Abstract:

Purpose – The purpose of this article is to present an empirical analysis of complex sample data with regard to the biasing effect of non-independence of observations on standard error (SE) estimates of model parameters. Using field data structured as repeated measurements, it is shown, within a two-factor confirmatory factor analysis model, how the SE bias arises when the non-independence is ignored.

Design/methodology/approach – Three estimation procedures are compared: normal asymptotic theory (maximum likelihood); non-parametric standard error estimation (naïve bootstrap); and sandwich (robust covariance matrix) estimation (pseudo-maximum likelihood).

Findings – The study reveals that, when using either normal asymptotic theory or non-parametric standard error estimation, the SE bias produced by the non-independence of observations can be noteworthy.
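The direction of the bias is easy to demonstrate: a naive bootstrap that resamples individual observations treats clustered (repeated-measures) data as independent and understates the standard error, while resampling whole clusters preserves the within-cluster dependence. A minimal sketch with hypothetical data, not the paper's field data or its factor analysis model:

```python
import random
import statistics

def naive_bootstrap_se(values, stat=statistics.mean, reps=500, seed=0):
    """Naive bootstrap SE: resample individual observations, ignoring
    any cluster (repeated-measures) structure."""
    rng = random.Random(seed)
    n = len(values)
    stats = [stat([values[rng.randrange(n)] for _ in range(n)])
             for _ in range(reps)]
    return statistics.stdev(stats)

def cluster_bootstrap_se(clusters, stat=statistics.mean, reps=500, seed=0):
    """Cluster bootstrap SE: resample whole clusters, preserving the
    non-independence of observations within each cluster."""
    rng = random.Random(seed)
    k = len(clusters)
    stats = []
    for _ in range(reps):
        sample = []
        for _ in range(k):
            sample.extend(clusters[rng.randrange(k)])
        stats.append(stat(sample))
    return statistics.stdev(stats)
```

With strong within-cluster correlation, the cluster bootstrap SE comes out clearly larger than the naive one, which is the SE bias the findings describe; the sandwich (robust covariance) estimator addresses the same problem analytically.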

Research limitations/implications – Considering the methodological constraints of employing field data, the three analyses examined must be interpreted independently, and as a result taxonomic generalisations are limited. However, the study still provides "case study" evidence suggesting a relationship between non-independence of observations and standard error bias.

Originality/value – Given the increasing popularity of structural equation models in the social sciences and in particular in the marketing discipline, the paper provides a theoretical and practical insight into how to treat repeated measures and clustered data in general, adding to previous methodological research. Some conclusions and suggestions for researchers who make use of partial least squares modelling are also drawn.

Relevance: 30.00%

Abstract:

The current work used discrete event simulation techniques to model the economics of quality within an actual automotive stamping plant. Automotive stamping is a complex, capital-intensive process requiring part-specific tooling and specialised machinery. Quality control and quality improvement are difficult in the stamping environment due to the general lack of process understanding and the large number of interacting variables; these factors have prevented the widespread use of statistical process control. In this work, a model of the quality control techniques used at the Ford Geelong Stamping plant is developed and indirectly validated against results from production. To date, most discrete event models describe systems where the quality control process is clearly defined by the rules of statistical process control. However, the quality control technique used within the stamping plant is a 100% visual inspection performed by the operator while unloading the finished panels. In the developed model, control is enacted after a cumulative count of defective items is observed, thereby approximating the operator, who allows a number of defective panels to accumulate before resetting the line. Analysis of this model found that the cost sensitivity to inspection error depends upon the level of control, and that the level of control determines line utilisation. Further analysis demonstrated that additional inspection processes would lead to more stable cost structures, though not necessarily lower costs. The model was subsequently applied to investigate the economics of quality improvement. The quality problem of panel blemishes, induced by slivers (small metal fragments), was chosen as a case study.

Errors of 20-30% were observed during direct validation of the cost model, and it was concluded that the use of discrete event simulation models for applications requiring high accuracy would not be possible unless the production system was of low complexity. However, the model could still be used to evaluate the sensitivity of input factors and to investigate the effects of a number of potential improvement opportunities. The research therefore concluded that it is possible to use discrete event simulation to determine the quality economics of an actual stamping plant, although limitations imposed by the inability of the model to consider a number of external factors, such as continuous improvement, operator working conditions or wear, and the lack of reliable quality data, result in low cost accuracy. Despite this, discrete event simulation can still be shown to have significant benefits over the alternative modelling methods.
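The cumulative-count control rule described above can be sketched as a toy simulation: panels are produced with some defect probability, and the line is reset once a set number of defectives has accumulated since the last reset. The parameter values are illustrative, not calibrated to the Geelong plant:

```python
import random

def simulate_line(n_panels, p_defect, reset_limit, seed=0):
    """Toy run of the control rule in the model: the operator lets
    defective panels accumulate and resets the line once `reset_limit`
    defectives have been observed since the last reset.
    Returns (good_panels, defective_panels, resets)."""
    rng = random.Random(seed)
    good = bad = resets = 0
    since_reset = 0
    for _ in range(n_panels):
        if rng.random() < p_defect:
            bad += 1
            since_reset += 1
            if since_reset >= reset_limit:
                resets += 1
                since_reset = 0
        else:
            good += 1
    return good, bad, resets
```

In the full model, each reset would carry a downtime cost and each defective panel a scrap or rework cost, which is how the level of control (the reset limit) feeds into both line utilisation and the cost structure.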