30 results for Probabilistic methodology


Relevance:

20.00%

Abstract:

The presentation of an aesthetic identity involves the accomplishment of a coherent, plausible narrative which links one's choices to desired characteristics of the self. As symbolic evidence of a person's taste, material culture is a vital component of a successful narrative. Via case studies of pivotal household objects, this paper uses face-to-face interview data as a way of investigating processes of aesthetic choice. Household objects are interpreted as material elements imbricated in the presentation of a socially plausible and internally consistent aesthetic self. Narrative analysis, and the concept of the epiphany-object, are proposed as useful ways of accounting for tastes in domestic material culture. Methodological questions of truth-telling and authenticity in the face-to-face context are considered, and the sociological problem of taste is scrutinized in light of ideas about social accountability and textual identity.

Relevance:

20.00%

Abstract:

Bond's method for ball mill scale-up only gives the mill power draw for a given duty. The method is incompatible with computer modelling and simulation techniques, and it may not be applicable to the design of fine-grinding ball mills or of ball mills preceded by autogenous and semi-autogenous grinding mills. Model-based ball mill scale-up methods have not been validated against a wide range of full-scale circuit data, so their accuracy is questionable, and some of them also require expensive pilot testing. A new ball mill scale-up procedure is developed which does not have these limitations. The procedure uses data from two laboratory tests to determine the parameters of a ball mill model; a set of scale-up criteria then scales up these parameters, and the scaled-up parameters are used to simulate the steady-state performance of full-scale mill circuits. At the end of the simulation, the procedure gives the size distribution, volumetric flowrate and mass flowrate of all the streams in the circuit, and the mill power draw.
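
The abstract does not give the model equations, but the pipeline it describes (laboratory parameters, then scale-up criteria, then steady-state simulation) can be illustrated with a minimal perfectly mixed population-balance mill model. Everything below, including the breakage distribution, the rate values, and the diameter-ratio scale-up rule, is an assumed stand-in and not the paper's validated procedure.

```python
import numpy as np

# Minimal sketch of the lab-parameters -> scale-up -> steady-state-simulation
# pipeline, assuming a perfectly mixed mill: (I + tau*S - tau*B@S) p = f,
# where S holds breakage rates and B is a breakage distribution matrix.
# All numbers below are illustrative assumptions.

n = 5                                          # size classes, coarse -> fine
f = np.array([40.0, 30.0, 15.0, 10.0, 5.0])    # feed mass flow per class, t/h

s_lab = np.array([0.8, 0.5, 0.3, 0.15, 0.0])   # lab-fitted breakage rates, 1/min

# Broken material from class j reports to finer classes with geometric weights
B = np.zeros((n, n))
for j in range(n - 1):
    w = 0.5 ** np.arange(1, n - j)
    B[j + 1:, j] = w / w.sum()

# Hypothetical scale-up criterion: rates grow with the mill diameter ratio
D_full, D_lab = 5.0, 0.3                       # mill diameters, m (assumed)
s_full = s_lab * (D_full / D_lab) ** 0.5

tau = 4.0                                      # mean residence time, min (assumed)
S = np.diag(s_full)
p = np.linalg.solve(np.eye(n) + tau * S - tau * B @ S, f)

print("product size distribution, t/h:", np.round(p, 2))  # sums to feed total
```

Because the breakage distribution conserves mass, the simulated product flows sum to the feed total; only the split across size classes changes with the scaled-up rates.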

Relevance:

20.00%

Abstract:

Historically, few articles have addressed the use of district-level mill production data for analysing the effect of varietal change on sugarcane productivity trends. This appears to be due to a lack of compiled district data sets and of appropriate methods by which to analyse them. Recently, varietal data on tonnes of sugarcane per hectare (TCH), sugar content (CCS), and their product, tonnes of sugar content per hectare (TSH), have been compiled on a district basis. This study was conducted to develop a methodology for regular analysis of such mill-district data to assess productivity trends over time, accounting for variety and variety × environment interaction effects, for 3 mill districts (Mulgrave, Babinda, and Tully) from 1958 to 1995. Restricted maximum likelihood methodology was used to analyse the district-level data; best linear unbiased predictors for random effects and best linear unbiased estimates for fixed effects were computed in a mixed-model analysis. In the combined analysis over districts, Q124 was the top-ranking variety for TCH, and Q120 was top-ranking for both CCS and TSH. Overall production for TCH increased over the 38-year period investigated. Some of this increase can be attributed to varietal improvement, although the predictors for TCH have shown little progress since the introduction of Q99 in 1976. Although smaller gains have been made in varietal improvement for CCS, overall production for CCS decreased over the 38 years due to non-varietal factors. Varietal improvement in TSH appears to have peaked in the mid-1980s. Overall production for TSH remained stable over time due to the varietal increase in TCH and the non-varietal decrease in CCS.
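
As a rough illustration of the REML/BLUP approach described above, the sketch below fits a mixed model with random variety intercepts and a fixed year trend. The data are synthetic stand-ins generated on the spot; the real district series, the model terms, and the variety set (one name here is invented) are not taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the district-level series (illustrative only):
# each variety gets a true random effect; TCH trends linearly with year.
rng = np.random.default_rng(1)
rows = []
for variety, effect in zip(["Q99", "Q120", "Q124", "Q130"],
                           rng.normal(0.0, 5.0, 4)):
    for year in range(1958, 1996):
        rows.append({"variety": variety,
                     "year": year - 1958,
                     "TCH": 70 + 0.3 * (year - 1958) + effect
                            + rng.normal(0.0, 4.0)})
df = pd.DataFrame(rows)

# REML fit: fixed year trend (BLUEs) plus random variety intercepts (BLUPs)
result = smf.mixedlm("TCH ~ year", df, groups=df["variety"]).fit(reml=True)

print(result.fe_params)                          # BLUEs: intercept and trend
for variety, re in result.random_effects.items():
    print(variety, round(float(re.iloc[0]), 2))  # BLUP for each variety
```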

Relevance:

20.00%

Abstract:

The present paper addresses two major concerns that were identified when developing neural-network-based prediction models and which can limit their wider applicability in the industry. The first problem is that neural network models do not appear to be readily accessible to the corrosion engineer. The first part of this paper therefore describes a neural network model of CO2 corrosion created using a standard commercial software package and simple modelling strategies. Such a model was able to capture practically all of the trends observed in the experimental data with acceptable accuracy. This exercise demonstrates that a corrosion engineer could readily develop a neural network model like the one described here for any problem at hand, given sufficient experimental data, even when the understanding of the underlying processes is poor. The second problem arises in cases when not all the required inputs to a model are known, or when they can be estimated only with limited accuracy. In such cases it is advantageous to have models that accept a range, rather than a single value, as input. One such model, based on the so-called Monte Carlo approach, is presented. A number of comparisons illustrate how a corrosion engineer might use this approach to rapidly test the sensitivity of a model to the uncertainties associated with the input parameters. (C) 2001 Elsevier Science Ltd. All rights reserved.
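
The Monte Carlo idea in the abstract, feeding the model ranges instead of point values and reading off the output spread, can be sketched generically. The "model" below is a toy function with plausible qualitative trends only, not the paper's neural network, and all input distributions are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

# Wrap a point-estimate model (a toy stand-in, NOT the paper's trained
# neural network) in Monte Carlo sampling of uncertain inputs.
rng = np.random.default_rng(0)

def corrosion_rate(temp_c, pco2_bar, ph):
    # Placeholder with roughly sensible trends only (notional mm/year)
    return 0.1 * pco2_bar ** 0.67 * np.exp(0.05 * temp_c) * 10 ** (5.0 - ph)

n = 20_000
temp = rng.normal(60.0, 5.0, n)        # temperature known to about +/- 5 C
pco2 = rng.uniform(0.5, 2.0, n)        # CO2 partial pressure only bounded
ph = rng.normal(5.5, 0.2, n)           # pH with measurement uncertainty

rate = corrosion_rate(temp, pco2, ph)
print("median rate:", round(float(np.median(rate)), 3))
print("90% interval:", np.percentile(rate, [5, 95]).round(3))

# Crude sensitivity ranking: rank correlation of each input with the output
for name, x in [("temp", temp), ("pCO2", pco2), ("pH", ph)]:
    print(name, round(spearmanr(x, rate)[0], 2))
```

The rank correlations give a quick, model-agnostic sensitivity ranking of the inputs, which is the kind of rapid what-if check the abstract attributes to the range-as-input approach.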

Relevance:

20.00%

Abstract:

A methodology and framework for discipline-specific curriculum development in a local context is described. These activities, as part of the Thailand-Australia Science and Engineering Assistance Project (TASEAP), were in response to a needs analysis for curriculum assistance to a number of publicly funded Thai universities in the engineering priority area of Materials Processing and Manufacturing. The paper outlines a strategy for the delivery of a centralised curriculum development workshop for academic staff, follow-up visits and local curriculum activities with participating universities, and the presentation of technical short courses, as guidance for such activity in other settings and/or discipline areas. This paper is part of a process of documentation so that others can apply the developed methodology and framework for curriculum development. While the paper is a report on curriculum activities in a particular setting, it is written in a manner that allows application of the methodology to other settings. The reader is advised that each curriculum activity needs to adopt a methodology and strategy to fit the particular circumstances being considered. To assist in applying this approach elsewhere, a description of the various steps in the curriculum process, and typical responses to some of the more global issues, is presented. Full details are available in the various TASEAP reports prepared by the authors. Specific detail has been omitted where it does not provide information for generalised consumption.

Relevance:

20.00%

Abstract:

Cervical auscultation is in the process of gaining clinical credibility. In order for it to be accepted by the clinical community, the procedure and equipment used must first be standardized. Takahashi et al. [Dysphagia 9:54-62, 1994] attempted to provide a benchmark methodology for administering cervical auscultation, identifying the acoustic detector unit best suited to picking up swallowing sounds and the best cervical site at which to place it. The current investigation provides results that contrast with those of Takahashi et al. with respect to the best type of acoustic detector unit: our study advocates an electret microphone rather than an accelerometer for recording swallowing sounds. However, we agree on the optimal placement site. We conclude that cervical auscultation is within reach of the average dysphagia clinic.

Relevance:

20.00%

Abstract:

Over the last 7 years, a method has been developed in Brazil to analyse building energy performance using computer simulation. The method combines analysis of building design plans and documentation, walk-through visits, electric and thermal measurements, and the use of an energy simulation tool (the DOE-2.1E code). It was used to model more than 15 office buildings (more than 200 000 m²) located between 12.5° and 27.5° South latitude. The paper describes the basic methodology, with data for one building, and presents additional results for six other cases. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance:

20.00%

Abstract:

The purpose of this study was threefold. First, the study was designed to illustrate the use of data and information collected in food safety surveys in a quantitative risk assessment; in this case the focus was on the food service industry, although similar data from other parts of the food chain could be incorporated in the same way. The second objective was to quantitatively describe and better understand the role that the food service industry plays in the safety of food. The third objective was to illustrate the additional decision-making information that becomes available when uncertainty and variability are incorporated into the modelling of such systems. (C) 2002 Elsevier Science B.V. All rights reserved.
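
One common way to incorporate uncertainty and variability separately, as the abstract emphasises, is a two-dimensional Monte Carlo: an outer loop over uncertain parameters and an inner loop over variability between outlets or servings. The sketch below uses that generic structure with invented distributions and dose-response values; it is not built from the study's survey data.

```python
import numpy as np

# Two-dimensional Monte Carlo sketch: outer loop = parameter uncertainty,
# inner loop = outlet-to-outlet variability. All values are invented.
rng = np.random.default_rng(42)

n_uncertainty, n_variability = 200, 1000
mean_risk = np.empty(n_uncertainty)
for i in range(n_uncertainty):
    prevalence = rng.beta(2, 50)                     # uncertain true prevalence
    temp = rng.normal(6.0, 2.0, n_variability)       # storage temp varies, deg C
    log_growth = np.clip(0.2 * (temp - 4.0), 0.0, None)  # toy growth response
    dose = prevalence * 10.0 ** log_growth           # notional ingested dose
    mean_risk[i] = np.mean(1.0 - np.exp(-0.005 * dose))  # toy dose-response

print("mean risk per serving:", round(float(mean_risk.mean()), 5))
print("95% uncertainty interval:",
      np.percentile(mean_risk, [2.5, 97.5]).round(5))
```

Separating the two loops is what yields the extra decision-making information the abstract mentions: an uncertainty interval around the risk estimate rather than a single point value.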

Relevance:

20.00%

Abstract:

For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the output of two independent implementations, in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
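
The residual-generation step of back-to-back testing is easy to picture: run two independently written implementations on identical inputs and difference the outputs. The sketch below does only that, with a deliberately seeded error standing in for a coding bug; the paper's modified observer, which maps each error to a known residual subspace for isolation, is not reproduced here.

```python
import numpy as np

# Bare-bones back-to-back test: two independently written implementations
# of the same model, driven by identical inputs, compared via residuals.

def model_a(u, dt=0.01, tau=0.5):
    """Reference implementation: first-order lag, explicit Euler."""
    x, out = 0.0, []
    for uk in u:
        x += dt * (uk - x) / tau
        out.append(x)
    return np.array(out)

def model_b(u, dt=0.01, tau=0.5):
    """Second implementation; the seeded factor stands in for a coding bug."""
    x, out = 0.0, []
    for uk in u:
        x += dt * (uk - x) / tau * 1.01   # seeded 1% coding error
        out.append(x)
    return np.array(out)

u = np.ones(500)                          # identical step input to both
residual = model_a(u) - model_b(u)
print("max |residual|:", np.abs(residual).max())  # nonzero -> code disagreement
```

A nonzero residual only signals that the implementations disagree; the observer-based structure described in the abstract is what turns that signal into isolation of the specific faulty equation.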

Relevance:

20.00%

Abstract:

Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of seasonal-climate-based commodity forecasting for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. The analysis presented was also relevant to measuring the quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast, and of avoiding simplifying but erroneous distortions, was discussed in relation to applying this new forecast-quality assessment paradigm to seasonal climate forecasts. Copyright (C) 2003 Royal Meteorological Society.
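
A rough reading of the three quality dimensions can be sketched on synthetic data: reliability via probability-integral-transform (PIT) values (a stand-in for the paper's reliability-plot slope), plus a median shift and an interquartile-range ratio against the full climatology. Distributions and sample sizes below are invented for illustration.

```python
import numpy as np

# Synthetic sketch of the three forecast-quality dimensions (illustrative only).
rng = np.random.default_rng(7)

def iqr(x):
    q75, q25 = np.percentile(x, [75, 25])
    return q75 - q25

clim = rng.normal(100.0, 15.0, 500)        # full climatological distribution
n_events = 40
shift, dispersion, pit = [], [], []
for _ in range(n_events):
    outcome = rng.normal(105.0, 10.0)      # realisation in a shifted season
    analog = rng.normal(105.0, 10.0, 30)   # analogue-year forecast members

    shift.append(np.median(analog) - np.median(clim))   # distribution shift
    dispersion.append(iqr(analog) / iqr(clim))          # change in dispersion
    pit.append(np.mean(analog < outcome))  # forecast probability of realisation

# Reliability: slope of sorted PIT values against uniform quantiles (~1 ideal)
q = (np.arange(n_events) + 0.5) / n_events
slope = np.polyfit(q, np.sort(pit), 1)[0]
print("reliability slope:", round(float(slope), 2))
print("mean distribution shift:", round(float(np.mean(shift)), 1))
print("mean dispersion ratio:", round(float(np.mean(dispersion)), 2))
```

A slope near 1 indicates unbiased forecast probabilities, while the shift and dispersion summaries capture the two orthogonal dimensions the abstract identifies relative to the climatological reference.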