906 results for Classical measurement error model


Relevance: 100.00%

Abstract:

In this paper we review simulation and experimental studies of thermal capillary wave fluctuations as an ideal means for probing the underlying disjoining pressure and surface tensions and, more generally, fine details of the Interfacial Hamiltonian Model. We discuss recent simulation results that reveal a film-height-dependent surface tension not accounted for in the classical Interfacial Hamiltonian Model. We show how this observation may be explained bottom-up from sound principles of statistical thermodynamics, and we discuss some of its implications.
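
A worked equation helps fix ideas here. In the classical capillary wave treatment of a thin adsorbed film, equipartition applied to the Interfacial Hamiltonian gives the equilibrium spectrum of thermal height fluctuations; the notation below is the standard textbook one, assumed rather than taken from the paper:

$$\langle |\delta h_q|^2 \rangle = \frac{k_B T}{A\,\left[\, g''(\ell) + \gamma q^2 \,\right]},$$

where ℓ is the mean film height, g(ℓ) the interface potential (the disjoining pressure is Π(ℓ) = −g′(ℓ)), γ the surface tension, A the interfacial area, and q the wavevector of the height mode. The film-height-dependent surface tension reported above amounts to replacing the constant γ by γ(ℓ) in this spectrum.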

Relevance: 100.00%

Abstract:

Coupling a shop floor software system (SFS) with a set of production equipment (SPE) is a complex task. It involves open and proprietary standards and information and communication technologies, among other tools and techniques. Because of market turbulence, both custom solutions and standards-based solutions eventually require considerable adaptation effort. The concept of loose coupling has been identified in the organizational design community as a support for organizational survival: its presence reduces the organization's resistance to changes in the environment. In this paper, the results obtained by the organizational design community are identified, translated, and organized to support the solution of the SFS-SPE integration problem. A classical loose coupling model developed by the organizational studies community is abstracted and translated to the area of interest. Key aspects are identified for use as promoters of loose coupling between SFS and SPE, and presented in the form of a reference scheme. This reference scheme is then proposed as a basis for the design and implementation of a generic coupling solution, or coupling framework, to be included as a loose coupling stage between SFS and SPE. A validation example with several sets of manufacturing equipment, using different physical communication media, controller commands, equipment programming languages and wire protocols, is presented, showing an acceptable level of autonomy gained by the SFS.
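
To make the loose coupling stage concrete, here is a minimal, hypothetical sketch in Python; the class and method names are illustrative inventions, not the paper's reference scheme. The SFS talks only to a stable abstract interface, while each adapter hides one combination of physical medium, controller commands, and wire protocol:

```python
from abc import ABC, abstractmethod

class EquipmentAdapter(ABC):
    """Hypothetical adapter interface: hides one combination of physical
    medium, controller command set and wire protocol from the SFS."""

    @abstractmethod
    def start_job(self, job_id: str) -> None:
        """Translate a neutral SFS request into equipment-specific commands."""

    @abstractmethod
    def status(self) -> str:
        """Report equipment state in a vocabulary the SFS understands."""

class SerialCncAdapter(EquipmentAdapter):
    """One concrete binding (e.g. G-code over RS-232); swapping it for
    another adapter requires no change on the SFS side."""

    def start_job(self, job_id: str) -> None:
        print(f"[RS-232] streaming G-code for {job_id}")  # placeholder I/O

    def status(self) -> str:
        return "IDLE"

def sfs_dispatch(adapter: EquipmentAdapter, job_id: str) -> None:
    # The shop floor system depends only on the abstract interface;
    # this indirection is the loose coupling stage.
    adapter.start_job(job_id)
    print("equipment reports:", adapter.status())

if __name__ == "__main__":
    sfs_dispatch(SerialCncAdapter(), "JOB-001")
```

Replacing `SerialCncAdapter` with a different binding changes nothing on the SFS side, which is the kind of autonomy the validation example measures.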

Relevance: 100.00%

Abstract:

Experimental time series for a nonequilibrium reaction may in some cases contain sufficient data to determine a unique kinetic model for the reaction by a systematic mathematical analysis. As an example, a kinetic model for the self-assembly of microtubules is derived here from turbidity time series for solutions in which microtubules assemble. The model may be seen as a generalization of Oosawa's classical nucleation-polymerization model. It reproduces the experimental data with a four-stage nucleation process and a critical nucleus of 15 monomers.
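
As a rough illustration of the model class (not the fitted model from the paper: the rate constants, the single-step nucleation used here, and the units are placeholders), an Oosawa-type nucleation-polymerization scheme can be integrated directly:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Oosawa-type model: free monomers m form a nucleus of size n_c, after
# which polymers p grow by monomer addition. k_n, k_p are placeholders.
n_c, k_n, k_p = 15, 1e-3, 10.0

def rhs(t, y):
    m, p = y                      # m: free monomer, p: polymer number conc.
    nucleation = k_n * m**n_c     # rate of new filament creation
    growth = k_p * m * p          # monomer consumption by elongation
    return [-n_c * nucleation - growth, nucleation]

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.0], dense_output=True)
mass_in_polymer = 1.0 - sol.y[0]  # polymerized fraction rises sigmoidally,
                                  # like an experimental turbidity curve
```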

Relevance: 100.00%

Abstract:

This paper empirically analyses a dataset of more than 7,300 agricultural land sales transactions from 2001 and 2007 to identify the factors influencing agricultural land prices in Bavaria. We use a general spatial model, which combines a spatial lag and a spatial error model, and in addition account for endogeneity introduced by the spatially lagged dependent variable as well as other explanatory variables. Our findings confirm the strong influence of agricultural factors such as land productivity, of variables describing the regional land market structure, and of non-agricultural factors such as urban pressure on agricultural land prices. Moreover, the involvement of public authorities as a seller or buyer increases sales prices in Bavaria. We find a significant capitalisation of government support payments into agricultural land, where a decrease of direct payments by 1% would decrease land prices in 2007 and 2001 by 0.27% and 0.06%, respectively. In addition, we confirm strong spatial relationships in our dataset. Neglecting this leads to biased estimates, especially if aggregated data is used. We find that the price of a specific plot increases by 0.24% when sales prices in surrounding areas increase by 1%.
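
In standard notation (assumed here, not quoted from the paper), the general spatial model combining a spatial lag with a spatial error process reads:

$$y = \rho W y + X\beta + u, \qquad u = \lambda W u + \varepsilon,$$

where W is a spatial weights matrix, ρ captures the dependence of a plot's price on neighbouring sales prices (the 0.24% elasticity reported above), and λ captures spatially correlated unobservables. Because the lagged term Wy is endogenous, instrumental-variable or GMM estimation is required, which is the endogeneity issue the abstract refers to.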

Relevance: 100.00%

Abstract:

This work addresses the asset allocation (portfolio analysis) problem from a Bayesian perspective. To this end, the theoretical analysis of the classical mean-variance model is first reviewed, and its deficiencies, which compromise its effectiveness in real-world cases, are identified. Curiously, its greatest deficiency is not related to the model itself but to its inputs, in particular the expected return estimated from historical data. To overcome this deficiency, the Bayesian approach (the Black-Litterman model) treats the expected return as a random variable and then constructs a prior distribution (based on the CAPM) and a likelihood (based on the investor's market views), finally applying Bayes' theorem to obtain the posterior distribution. The new expected return that emerges from the posterior distribution replaces the previous estimate computed from historical data. The results obtained show that the Bayesian model produces conservative and intuitive results relative to the classical mean-variance model.
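
A minimal numerical sketch of the Black-Litterman posterior mean follows; the covariance matrix, market weights, and the single investor view are made-up inputs chosen for illustration, not results from this work:

```python
import numpy as np

# Black-Litterman posterior mean:
# E[R] = [(tau*Sigma)^-1 + P' Omega^-1 P]^-1 [(tau*Sigma)^-1 pi + P' Omega^-1 q]
Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])   # asset return covariance
w_mkt = np.array([0.6, 0.4])                     # market-cap weights
delta, tau = 2.5, 0.05                           # risk aversion, prior scale
pi = delta * Sigma @ w_mkt                       # CAPM equilibrium prior

P = np.array([[1.0, -1.0]])                      # one view: asset 1 beats asset 2
q = np.array([0.02])                             # ...by 2% per year
Omega = np.array([[0.0004]])                     # (un)certainty of the view

A = np.linalg.inv(tau * Sigma)
post_mean = np.linalg.solve(A + P.T @ np.linalg.inv(Omega) @ P,
                            A @ pi + P.T @ np.linalg.inv(Omega) @ q)
print(post_mean)   # posterior expected returns, replacing historical means
```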

Relevance: 100.00%

Abstract:

Thesis completed under a cotutelle (joint supervision) agreement between the Université de Montréal and the Université Pierre et Marie Curie, Paris 06, Sorbonne Universités.

Relevance: 100.00%

Abstract:

We define several quantitative measures of the robustness of a quantum gate against noise. Exact analytic expressions for the robustness against depolarizing noise are obtained for all bipartite unitary quantum gates, and it is found that the controlled-NOT gate is the most robust two-qubit quantum gate, in the sense that it is the quantum gate which can tolerate the most depolarizing noise and still generate entanglement. Our results enable us to place several analytic upper bounds on the value of the threshold for quantum computation, with the best bound in the most pessimistic error model being p_th ≤ 0.5.
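
For reference, the depolarizing noise model used in this setting replaces a state by the maximally mixed state with probability p (this is the standard definition, not notation specific to the paper):

$$\mathcal{E}_p(\rho) = (1-p)\,\rho + p\,\frac{I}{d},$$

so a gate's robustness is the largest p at which the noisy gate still creates entanglement, and the bound p_th ≤ 0.5 states that the fault-tolerance threshold cannot exceed this value under the most pessimistic error model.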

Relevance: 100.00%

Abstract:

Background: Regression to the mean (RTM) is a statistical phenomenon that can make natural variation in repeated data look like real change. It happens when unusually large or small measurements tend to be followed by measurements that are closer to the mean. Methods: We give some examples of the phenomenon, and discuss methods to overcome it at the design and analysis stages of a study. Results: The effect of RTM in a sample becomes more noticeable with increasing measurement error and when follow-up measurements are only examined on a sub-sample selected using a baseline value. Conclusions: RTM is a ubiquitous phenomenon in repeated data and should always be considered as a possible cause of an observed change. Its effect can be alleviated through better study design and use of suitable statistical methods.
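
A small simulation makes the mechanism concrete (a sketch, with an assumed test-retest reliability of 0.5): with no true change at all, a subgroup selected for high baseline values appears to "improve" at follow-up, and the apparent change grows as measurement error grows:

```python
import numpy as np

rng = np.random.default_rng(0)
n, icc = 100_000, 0.5                 # icc: share of variance that is "true"
true = rng.normal(0, np.sqrt(icc), n)
base = true + rng.normal(0, np.sqrt(1 - icc), n)    # baseline measurement
follow = true + rng.normal(0, np.sqrt(1 - icc), n)  # follow-up, no real change

top = base > np.quantile(base, 0.9)   # select on extreme baseline values
print(base[top].mean())    # ~1.75: the selected group looks high...
print(follow[top].mean())  # ~0.88: ...then "improves" toward the mean (RTM)
```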

Relevance: 100.00%

Abstract:

There is some evidence that dietary factors may modify the risk of squamous cell carcinoma (SCC) of the skin, but the association between food intake and SCC has not been evaluated prospectively. We examined the association between food intake and SCC incidence among 1,056 randomly selected adults living in an Australian sub-tropical community. Measurement-error-corrected estimates of intake in 15 food groups were derived from a validated food frequency questionnaire in 1992. Associations with SCC risk were assessed using Poisson and negative binomial regression for the persons affected and for tumour counts, respectively, based on incident, histologically confirmed tumours occurring between 1992 and 2002. After multivariable adjustment, none of the food groups was significantly associated with SCC risk. Stratified analysis in participants with a past history of skin cancer showed a decreased risk of SCC tumours for high intakes of green leafy vegetables (RR = 0.45, 95% CI: 0.22-0.91; p for trend = 0.02) and an increased risk for high intake of unmodified dairy products (RR = 2.53, 95% CI: 1.15-5.54; p for trend = 0.03). Food intake was not associated with SCC risk in persons who had no past history of skin cancer. These findings suggest that consumption of green leafy vegetables may help prevent development of subsequent SCCs of the skin among people with previous skin cancer, and that consumption of unmodified dairy products, such as whole milk, cheese and yoghurt, may increase SCC risk in susceptible persons.
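
For readers unfamiliar with the two count models, a self-contained sketch follows; the data are synthetic stand-ins (an assumed intake variable and simulated tumour counts), not the study's cohort:

```python
import numpy as np
import statsmodels.api as sm

# Illustrative only: synthetic stand-in for (food-group intake -> tumour counts).
rng = np.random.default_rng(1)
n = 1056
intake = rng.normal(size=n)                       # e.g. green leafy vegetables
X = sm.add_constant(intake)
counts = rng.poisson(np.exp(0.1 - 0.3 * intake))  # synthetic tumour counts

poisson_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial()).fit()
print(np.exp(poisson_fit.params[1]))  # rate ratio per 1-unit change in intake
```

The negative binomial family relaxes the Poisson assumption that the variance equals the mean, which matters for overdispersed tumour counts.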

Relevance: 100.00%

Abstract:

Purpose: This study was conducted to examine the test-retest reliability of a measure of prediagnosis physical activity participation administered to colorectal cancer survivors recruited from a population-based state cancer registry. Methods: A total of 112 participants completed two telephone interviews, 1 month apart, reporting usual weekly physical activity in the year before their cancer diagnosis. Intraclass correlation coefficients (ICC) and the standard error of measurement (SEM) were used to describe the test-retest reliability of the measure across the sample; the Bland-Altman approach was used to describe reliability at the individual level. The test-retest reliability for categorized total physical activity (active, insufficiently active, sedentary) was assessed using the kappa statistic. Results: When the complete sample was considered, the ICC ranged from 0.40 (95% CI: 0.24, 0.55) for vigorous gardening to 0.77 (95% CI: 0.68, 0.84) for moderate physical activity. The SEMs, however, were large, indicating high measurement error. The Bland-Altman plots indicated that the reproducibility of the data decreases as the amount of physical activity reported each week increases. The kappa coefficient for the categorized data was 0.62 (95% CI: 0.48, 0.76). Conclusion: Overall, the results indicated low levels of repeatability for this measure of historical physical activity. Categorizing participants as active, insufficiently active, or sedentary provides a higher level of test-retest reliability.
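
The three reliability statistics fit in a few lines; the paired reports below are simulated stand-ins (assumed units of MET-hours/week), and the ICC is approximated by a simple Pearson correlation rather than the ANOVA-based estimator a full analysis would use:

```python
import numpy as np

# Test-retest sketch: ICC-based SEM and Bland-Altman limits of agreement.
rng = np.random.default_rng(2)
true = rng.gamma(2.0, 10.0, 112)        # "usual" activity, 112 participants
x1 = true + rng.normal(0, 8, 112)       # interview 1
x2 = true + rng.normal(0, 8, 112)       # interview 2, one month later

sd = np.std(np.concatenate([x1, x2]), ddof=1)
icc = np.corrcoef(x1, x2)[0, 1]         # crude stand-in for the ANOVA ICC
sem = sd * np.sqrt(1 - icc)             # typical error of a single report

d = x1 - x2                             # Bland-Altman: bias and 95% limits
bias, loa = d.mean(), 1.96 * d.std(ddof=1)
print(f"SEM ~ {sem:.1f}; bias {bias:.1f}; limits {bias-loa:.1f} to {bias+loa:.1f}")
```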

Relevance: 100.00%

Abstract:

The consensus from published studies is that plasma lipids are each influenced by genetic factors, and that this contributes to genetic variation in risk of cardiovascular disease. Heritability estimates for lipids and lipoproteins are in the range 0.48 to 0.87 when measured once per study participant. However, this ignores the confounding effects of biological variation, measurement error and ageing, and a truer assessment of genetic effects on cardiovascular risk may be obtained from analysis of longitudinal twin or family data. We have analyzed information on plasma high-density lipoprotein (HDL) and low-density lipoprotein (LDL) cholesterol, and triglycerides, from 415 adult twins who provided blood on two to five occasions over 10 to 17 years. Multivariate modeling of genetic and environmental contributions to variation within and across occasions was used to assess the extent to which genetic and environmental factors have long-term effects on plasma lipids. Results indicated that more than one genetic factor influenced the HDL and LDL components of cholesterol, and triglycerides, over time in all studies. Nonshared environmental factors did not have significant long-term effects except for HDL. We conclude that when heritability of lipid risk factors is estimated on only one occasion, the existence of biological variation and measurement error leads to underestimation of the importance of genetic factors as a cause of variation in long-term risk within the population. In addition, our data suggest that different genes may affect the risk profile at different ages.
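
The underestimation argument can be made explicit in standard variance-components notation (assumed here, not quoted from the paper). A single-occasion estimate is

$$h^2_{\mathrm{obs}} = \frac{\sigma^2_G}{\sigma^2_G + \sigma^2_E + \sigma^2_e} \;<\; \frac{\sigma^2_G}{\sigma^2_G + \sigma^2_E} = h^2_{\mathrm{long\ term}},$$

where σ²_G is genetic variance, σ²_E stable environmental variance, and σ²_e the occasion-specific variance (biological fluctuation plus measurement error) that longitudinal modelling across repeated blood draws can separate out.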

Relevance: 100.00%

Abstract:

Many studies of quantitative and disease traits in human genetics rely upon self-reported measures. Such measures are based on questionnaires or interviews and are often cheaper and more readily available than alternatives. However, their precision and potential bias cannot usually be assessed. Here we report a detailed quantitative genetic analysis of stature. We characterise the degree of measurement error by utilising a large sample of Australian twin pairs (857 MZ, 815 DZ) with both clinical and self-reported measures of height. Self-reported height measurements are shown to be more variable than clinical measures; this has led to lowered estimates of heritability in many previous studies of stature. In our twin sample the heritability estimate for clinical height exceeded 90%. Repeated-measures analysis shows that 2-3 times as many self-report measures are required to recover heritability estimates similar to those obtained from clinical measures. Bivariate genetic repeated-measures analysis of self-report and clinical height measures showed an additive genetic correlation > 0.98. We show that self-reported height is upwardly biased in older individuals and in individuals of short stature. By comparing clinical and self-report measures we also found a genetic component to females systematically misreporting their height; this phenomenon appeared to be absent in males. The results from the measurement error analysis were subsequently used to assess the effects of error on the power to detect linkage in a genome scan. A moderate reduction in error (through the use of accurate clinical or multiple self-report measures) increased the effective sample size by 22%; elimination of measurement error led to an increase in effective sample size of 41%.
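
The "2-3 times as many self-report measures" point can be illustrated with a back-of-envelope attenuation calculation; the error share below is an assumed number chosen only to show the shape of the effect, not a value estimated in the study:

```python
# How many averaged self-report measures approach the "clinical" heritability?
# Assumed values: true h2 = 0.90, self-report error adding 25% extra variance
# (e = 0.25). Averaging k independent reports divides the error variance by k.
h2_true, e = 0.90, 0.25
for k in (1, 2, 3, 4):
    h2_obs = h2_true / (1 + e / k)   # heritability observed with k reports
    print(k, round(h2_obs, 3))
# k=1 -> 0.72; k=2 -> 0.80; k=3 -> 0.831  (approaching the clinical 0.90)
```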

Relevance: 100.00%

Abstract:

Purpose - The purpose of this study is to develop a performance measurement model for service operations using the analytic hierarchy process approach. Design/methodology/approach - The study reviews current relevant literature on performance measurement and develops a model for performance measurement. The model is then applied to the intensive care units (ICUs) of three different hospitals in developing nations. Six focus group discussions were undertaken, involving experts from the specific area under investigation, in order to develop an understandable performance measurement model that was both quantitative and hierarchical. Findings - A combination of outcome-, structure- and process-based factors was used as a foundation for the model. Analysis of the links between them was used to reveal the relative importance of each factor and its associated subfactors. The stakeholders considered it an effective quantitative tool. Research limitations/implications - This research only applies the model to ICUs in healthcare services. Practical implications - Performance measurement is an important area within the operations management field. Although numerous models are routinely being deployed both in practice and research, there is always room for improvement. The present study proposes a hierarchical quantitative approach, which considers both subjective and objective performance criteria. Originality/value - This paper develops a hierarchical quantitative model for service performance measurement. It considers success factors with respect to outcomes, structure and processes, with the involvement of the concerned stakeholders, based upon the analytic hierarchy process approach. The model is applied to the ICUs of hospitals in order to demonstrate its effectiveness, providing a comparative international study of service performance measurement in ICUs of hospitals in three different countries. © Emerald Group Publishing Limited.
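
As a sketch of the computational core of the approach: AHP priority weights are conventionally obtained from the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency ratio as a sanity check. The matrix below is illustrative, not one of the study's elicited matrices:

```python
import numpy as np

# AHP sketch: weights from the principal eigenvector of a pairwise matrix.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])        # e.g. outcome vs structure vs process

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)               # principal eigenvalue (lambda_max)
w = np.abs(vecs[:, k].real)
w /= w.sum()                           # priority weights, summing to 1

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)      # consistency index
cr = ci / 0.58                         # Saaty's random index RI = 0.58 for n=3
print(w, cr)                           # CR < 0.1 means acceptably consistent
```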

Relevance: 100.00%

Abstract:

This study presents some quantitative evidence from a number of simulation experiments on the accuracy of the productivity growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches), and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error, and that when measurement error becomes larger, the accuracy of all approaches (including the stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
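
For orientation, the growth accounting estimate being benchmarked is the standard Solow residual (general form, not the paper's exact specification):

$$\widehat{\Delta \ln \mathrm{TFP}} = \Delta \ln y - s_K\,\Delta \ln K - s_L\,\Delta \ln L,$$

where s_K and s_L are the observed factor cost shares. Everything not absorbed by measured input growth lands in the residual, so technical inefficiency, functional-form error and measurement error are indistinguishable within GA; separating them is precisely what the frontier-based Malmquist methods attempt.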

Relevance: 100.00%

Abstract:

Several levels of complexity are available for the modelling of wastewater treatment plants. Modelling of local effects relies on computational fluid dynamics (CFD) approaches, whereas activated sludge models (ASM) represent the global methodology. By applying both modelling approaches to pilot plant and full scale systems, this paper evaluates the value of each method and especially their potential combination. Model structure identification for ASM is discussed based on the modelling of a full-scale closed-loop oxidation ditch. It is illustrated how, and in what circumstances, information obtained via CFD analysis, residence time distribution (RTD) and other experimental means can be used. Furthermore, CFD analysis of the multiphase flow mechanisms is employed to obtain a correct description of the oxygenation capacity of the system studied, including an easy implementation of this information in classical ASM modelling (e.g. oxygen transfer). The combination of CFD and activated sludge modelling of wastewater treatment processes is applied to three reactor configurations: a perfectly mixed reactor, a pilot scale activated sludge basin (ASB) and a real scale ASB. The application of the biological models to the CFD model is validated against experimentation for the pilot scale ASB and against a classical global ASM model response. A first step in the evaluation of the potential of the combined CFD-ASM model is performed using a full scale oxidation ditch system as a testing scenario.
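
As a minimal illustration of where the two levels meet: the ASM-side oxygen balance for one completely mixed tank is a single ODE whose transfer coefficient kLa is exactly the quantity a multiphase CFD analysis can supply in spatially resolved form. All parameter values below are illustrative, not calibrated to the systems in the paper:

```python
from scipy.integrate import solve_ivp

# Dissolved oxygen balance, ASM-style, for one completely mixed tank:
# dS_O/dt = kLa * (S_O_sat - S_O) - OUR
kla, s_sat, our = 4.0, 9.1, 30.0   # 1/h, g O2/m3, g O2/(m3*h); placeholders

def dso(t, s):
    return kla * (s_sat - s) - our

sol = solve_ivp(dso, (0.0, 2.0), [2.0])
print(sol.y[0][-1])   # steady state ~ s_sat - our/kla = 1.6 g O2/m3
```

In a combined CFD-ASM model the single kLa is replaced by values per CFD cell or compartment, which is how the oxygenation capacity information enters the classical ASM description.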