20 results for model validation

in University of Queensland eSpace - Australia


Relevance:

70.00%

Publisher:

Abstract:

Validation procedures play an important role in establishing the credibility of models, improving their relevance and acceptability. This article reviews the testing of models relevant to environmental and natural resource management with particular emphasis on models used in multicriteria analysis (MCA). Validation efforts for a model used in an MCA catchment management study in North Queensland, Australia, are presented. Determination of face validity is found to be a useful approach in evaluating this model, and sensitivity analysis is useful in checking the stability of the model. (C) 2000 Elsevier Science Ltd. All rights reserved.
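As an illustration of the kind of sensitivity check described above, the sketch below perturbs criterion weights in a simple weighted-sum MCA and reports how often the top-ranked option changes. The score matrix, options and weights are hypothetical placeholders, not the model used in the North Queensland study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical score matrix: rows = management options, columns = criteria
# (all values illustrative, not taken from the study).
scores = np.array([
    [0.7, 0.4, 0.9],   # option A
    [0.6, 0.8, 0.5],   # option B
    [0.9, 0.3, 0.6],   # option C
])
weights = np.array([0.5, 0.3, 0.2])  # baseline criterion weights

def rank_options(w):
    """Weighted-sum MCA: return option indices ordered from best to worst."""
    return np.argsort(scores @ w)[::-1]

baseline_best = rank_options(weights)[0]

# Sensitivity analysis: perturb the weights and check stability of the top choice.
stable, n_trials = 0, 1000
for _ in range(n_trials):
    w = weights * rng.uniform(0.8, 1.2, size=weights.shape)  # +/-20% perturbation
    w /= w.sum()                                             # re-normalise
    stable += rank_options(w)[0] == baseline_best

print(f"Top-ranked option unchanged in {stable / n_trials:.0%} of perturbations")
```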

Relevance:

60.00%

Publisher:

Abstract:

Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model, using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns, with transitional gradients from one vegetation community to another, rather than the arbitrary and often unrealistic sharp boundaries that the application of statistical methods can impose on a model. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including site sampling, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, model validation against an independent data set, and assessment of the scale of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r² = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for producing adequate vegetation maps for conservation and forestry planning in poorly studied areas. (c) 2006 Elsevier B.V. All rights reserved.
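A minimal sketch of the composite-mapping step, assuming each community model has already produced a per-cell probability surface (the arrays and community names below are hypothetical, and the real study integrated 28 such models): the pre-clearing map is taken as the most probable community in each grid cell.

```python
import numpy as np

# Hypothetical probability surfaces from three community models (grid of 100 x 100 cells).
communities = ["mesophyll vine forest", "swamp forest", "open woodland"]
prob_stack = np.random.default_rng(1).random((3, 100, 100))
prob_stack /= prob_stack.sum(axis=0)      # normalise so probabilities in each cell sum to 1

composite = prob_stack.argmax(axis=0)     # index of most probable community per cell
confidence = prob_stack.max(axis=0)       # winning probability, useful for map QA

print(communities[composite[0, 0]], round(float(confidence[0, 0]), 2))
```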

Relevance:

40.00%

Publisher:

Abstract:

Background: Reliability or validity studies are important for the evaluation of measurement error in dietary assessment methods. An approach to validation known as the method of triads uses triangulation techniques to calculate the validity coefficient of a food-frequency questionnaire (FFQ). Objective: To assess the validity of FFQ estimates of carotenoid and vitamin E intake against serum biomarker measurements and weighed food records (WFRs), by applying the method of triads. Design: The study population was a sub-sample of adult participants in a randomised controlled trial of beta-carotene and sunscreen in the prevention of skin cancer. Dietary intake was assessed by a self-administered FFQ and a WFR. Non-fasting blood samples were collected and plasma analysed for five carotenoids (alpha-carotene, beta-carotene, beta-cryptoxanthin, lutein, lycopene) and vitamin E. Correlation coefficients were calculated between each pair of dietary methods, and the validity coefficient was calculated using the method of triads. The 95% confidence intervals for the validity coefficients were estimated using bootstrap sampling. Results: The validity coefficients of the FFQ were highest for alpha-carotene (0.85) and lycopene (0.62), followed by beta-carotene (0.55) and total carotenoids (0.55), while the lowest validity coefficient was for lutein (0.19). The method of triads could not be used for beta-cryptoxanthin and vitamin E, as one of the three underlying correlations was negative. Conclusions: Results were similar to other studies of validity using biomarkers and the method of triads. For many dietary factors, the upper limit of the validity coefficients was less than 0.5, and therefore only strong relationships between dietary exposure and disease will be detected.
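For reference, the method of triads estimates the validity coefficient of the FFQ against unobserved true intake from the three pairwise correlations between methods. The sketch below uses hypothetical correlation values, not the study's data.

```python
import math

# Pairwise Pearson correlations between the three methods (hypothetical values):
r_qw = 0.45   # FFQ (Q) vs weighed food record (W)
r_qb = 0.35   # FFQ (Q) vs biomarker (B)
r_wb = 0.30   # weighed food record (W) vs biomarker (B)

# Method of triads: validity coefficient of the FFQ against "true" intake T is
# rho_QT = sqrt(r_QW * r_QB / r_WB). It is undefined if any correlation is negative,
# which is why it could not be applied to beta-cryptoxanthin and vitamin E.
rho_qt = math.sqrt(r_qw * r_qb / r_wb)
print(f"Validity coefficient of FFQ: {rho_qt:.2f}")
```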

Relevance:

40.00%

Publisher:

Abstract:

Background: Published birthweight references in Australia do not fully take into account constitutional factors that influence birthweight and therefore may not provide an accurate reference for identifying the infant with abnormal growth. Furthermore, studies in other regions that have derived adjusted (customised) birthweight references have applied untested assumptions in the statistical modelling. Aims: To validate the customised birthweight model and to produce a reference set of coefficients for estimating a customised birthweight that may be useful for maternity care in Australia and for future research. Methods: De-identified data were extracted from the clinical database for all births at the Mater Mother's Hospital, Brisbane, Australia, between January 1997 and June 2005. Births with missing data for the variables under study were excluded. In addition, the following were excluded: multiple pregnancies, births at less than 37 completed weeks' gestation, stillbirths, and major congenital abnormalities. Multivariate analysis was undertaken, and a double cross-validation procedure was used to validate the model. Results: The study of 42 206 births demonstrated that, for statistical purposes, birthweight is normally distributed. Coefficients for the derivation of customised birthweight in an Australian population were developed, and the statistical model is demonstrably robust. Conclusions: This study provides empirical data as to the robustness of the model for determining customised birthweight. Further research is required to define where normal physiology ends and pathology begins, and which segments of the population should be included in the construction of a customised birthweight standard.
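A hedged sketch of a double (two-fold) cross-validation of a customised birthweight regression: the model is fitted on each half of the sample and assessed on the other. The predictors, simulated values and linear form below are illustrative assumptions, not the hospital's data or the published coefficients.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 1000

# Hypothetical predictors and birthweights (grams); real coefficients would come
# from the clinical database described in the abstract.
X = np.column_stack([
    rng.normal(165, 6, n),    # maternal height (cm)
    rng.normal(65, 10, n),    # maternal weight (kg)
    rng.integers(0, 4, n),    # parity
    rng.integers(0, 2, n),    # infant sex (0/1)
    rng.normal(280, 8, n),    # gestational age (days)
])
y = 3400 + 10 * (X[:, 0] - 165) + 8 * (X[:, 1] - 65) + rng.normal(0, 400, n)

# Double cross-validation: fit on each half, predict the other half.
idx = rng.permutation(n)
a, b = idx[: n // 2], idx[n // 2 :]
for train, test in [(a, b), (b, a)]:
    model = LinearRegression().fit(X[train], y[train])
    print(f"out-of-sample R^2: {model.score(X[test], y[test]):.2f}")
```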

Relevance:

30.00%

Publisher:

Abstract:

Orthotopic liver retransplantation (re-OLT) is highly controversial. The objectives of this study were to determine the validity of a recently developed United Network for Organ Sharing (UNOS) multivariate model using an independent cohort of patients undergoing re-OLT outside the United States, to determine whether incorporation of other variables that were incomplete in the UNOS registry would provide additional prognostic information, to develop new models combining data sets from both cohorts, and to evaluate the validity of the Model for End-stage Liver Disease (MELD) in patients undergoing re-OLT. Two hundred eighty-one adult patients undergoing re-OLT (between 1986 and 1999) at 6 foreign transplant centers comprised the validation cohort. We found good agreement between actual survival and predicted survival in the validation cohort; 1-year patient survival rates in the low-, intermediate-, and high-risk groups (as assigned by the original UNOS model) were 72%, 68%, and 36%, respectively (P < .0001). In the patients for whom the international normalized ratio (INR) of prothrombin time was available, MELD correlated with outcome following re-OLT; the median MELD scores for patients surviving at least 90 days compared with those dying within 90 days were 20.75 versus 25.9, respectively (P = .004). Utilizing both patient cohorts (n = 979), a new model, based on recipient age, total serum bilirubin, creatinine, and interval to re-OLT, was constructed (whole-model χ² = 105, P < .0001). Using the c-statistic with 30-day, 90-day, 1-year, and 3-year mortality as the end points, the areas under the receiver operating characteristic (ROC) curves for 4 different models were compared. In conclusion, prospective validation and use of these models as adjuncts to clinical decision making in the management of patients being considered for re-OLT are warranted.
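As a sketch of the model-comparison step, the c-statistic for a binary end point (e.g., 1-year mortality) equals the area under the ROC curve. The outcomes and risk scores below are simulated for two hypothetical models, not the registry data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 300

# Simulated 1-year mortality outcomes and risk scores from two hypothetical models.
died = rng.integers(0, 2, n)
risk_model_a = died * 0.6 + rng.random(n) * 0.8   # score correlated with outcome
risk_model_b = rng.random(n)                      # uninformative score

# c-statistic = area under the ROC curve; higher means better discrimination.
for name, score in [("model A", risk_model_a), ("model B", risk_model_b)]:
    print(name, "c-statistic:", round(roc_auc_score(died, score), 2))
```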

Relevance:

30.00%

Publisher:

Abstract:

Mineral processing plants rely on two main processes: comminution and separation. The objective of the comminution process is to break complex particles consisting of numerous minerals into smaller, simpler particles in which individual particles consist primarily of only one mineral. The process by which the mineral composition distribution in particles changes due to breakage is called 'liberation'. The purpose of separation is to separate particles consisting of valuable mineral from those containing non-valuable mineral. The energy required to break particles to fine sizes is expensive, so the mineral processing engineer must design the circuit so that the breakage of liberated particles is reduced in favour of breaking composite particles. In order to optimize a circuit effectively through simulation, it is necessary to predict how the mineral composition distributions change due to comminution. Such a model is called a 'liberation model for comminution'. It was generally considered that such a model should incorporate information about the ore, such as its texture. However, the relationship between the feed and product particles can be estimated using a probability method, the probability being that a feed particle of a particular composition and size will form a particular product particle of a particular size and composition. The model is based on maximizing the entropy of this probability subject to mass and composition constraints. Not only does this methodology allow a liberation model to be developed for binary particles, but also for particles consisting of many minerals. Results from applying the model to real plant ore are presented. A laboratory ball mill was used to break particles, and the results from this experiment were used to estimate the kernel representing the relationship between parent and progeny particles. A second feed, consisting primarily of heavy particles subsampled from the main ore, was then ground through the same mill, and the results from the first experiment were used to predict the product of the second. The agreement between the predicted and actual results is very good. More extensive validation is nevertheless recommended to fully evaluate the method. (C) 2003 Elsevier Ltd. All rights reserved.
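A minimal sketch of the maximum-entropy idea, not the paper's actual kernel: choose the joint feed-to-product probability matrix that maximises entropy subject to linear constraints, here simplified to the feed marginal and overall conservation of valuable-mineral mass. The composition classes, grades and fractions are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical discretisation: 3 feed composition classes, 4 product composition classes.
feed_fraction = np.array([0.5, 0.3, 0.2])        # mass fraction of feed in each class
feed_grade = np.array([0.1, 0.5, 0.9])           # valuable-mineral grade of each feed class
product_grade = np.array([0.0, 0.3, 0.6, 1.0])   # grade of each product class
I, J = len(feed_fraction), len(product_grade)

def neg_entropy(p_flat):
    p = p_flat.reshape(I, J)
    return np.sum(p * np.log(p + 1e-12))         # minimise => maximise entropy

constraints = [
    # each feed class distributes all of its mass over the product classes
    {"type": "eq", "fun": lambda p: p.reshape(I, J).sum(axis=1) - feed_fraction},
    # total valuable-mineral mass is conserved across breakage
    {"type": "eq", "fun": lambda p: (p.reshape(I, J) @ product_grade).sum()
                                     - feed_fraction @ feed_grade},
]

p0 = np.full(I * J, 1.0 / (I * J))
res = minimize(neg_entropy, p0, bounds=[(0, 1)] * (I * J),
               constraints=constraints, method="SLSQP")
kernel = res.x.reshape(I, J) / feed_fraction[:, None]   # P(product class | feed class)
print(np.round(kernel, 3))
```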

Relevance:

30.00%

Publisher:

Abstract:

Objective: The Temptation and Restraint Inventory (TRI) is commonly used to measure drinking restraint in relation to problem drinking behavior. However, the TRI has not yet been validated in a clinical group with alcohol dependence. Method: Male (n = 111) and female (n = 57) inpatients with DSM-IV-diagnosed alcohol dependence completed the TRI and measures of problem drinking severity, including the Alcohol Dependence Scale and the quantity, frequency and weekly total of alcohol consumed. Results: The factor structure of the TRI was replicated in the alcohol-dependent sample. Cognitive Emotional Preoccupation (CEP), one of the two higher-order factors of the TRI, demonstrated sound predictive power for all dependence severity indices. The other higher-order factor, Cognitive Behavioral Control (CBC), was related to frequency of drinking. There was limited support for the CEP/CBC interactional model of drinking restraint. Conclusions: Although the construct validity of the TRI was sound, the measure appears more useful for understanding the development, maintenance and severity of alcohol-related problems in nondependent drinkers. The TRI may show promise in detecting either continuous-drinking or heavy episodic-type dependent drinkers.

Relevance:

30.00%

Publisher:

Abstract:

Aims: The aims of this study are to develop and validate a measure to screen for a range of gambling-related cognitions (GRC) in gamblers. Design and participants: A total of 968 volunteers were recruited from a community-based population. They were divided randomly into two groups. Principal axis factoring with varimax rotation was performed on group one, and confirmatory factor analysis (CFA) was used on group two to confirm the best-fitting solution. Measurements: The Gambling Related Cognition Scale (GRCS) was developed for this study, and the South Oaks Gambling Screen (SOGS), the Motivation Towards Gambling Scale (MTGS) and the Depression Anxiety Stress Scale (DASS-21) were used for validation. Findings: Exploratory factor analysis performed using half the sample indicated five factors: interpretative control/bias (GRCS-IB), illusion of control (GRCS-IC), predictive control (GRCS-PC), gambling-related expectancies (GRCS-GE) and a perceived inability to stop gambling (GRCS-IS). These accounted for 70% of the total variance. Using the other half of the sample, CFA confirmed that the five-factor solution fitted the data most effectively. Cronbach's alpha coefficients for the factors ranged from 0.77 to 0.91, and the coefficient for the overall scale was 0.93. Conclusions: This paper demonstrated that the 23-item GRCS has good psychometric properties and is thus a useful instrument for identifying GRC among non-clinical gamblers. It provides the first step towards devising/adapting similar tools for problem gamblers as well as developing more specialized instruments to assess particular domains of GRC.
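A hedged sketch of two of the scale-development steps, random split-half sampling and Cronbach's alpha, using a simulated 968 x 23 response matrix rather than the GRCS data; the factor extraction itself would require a dedicated package and is omitted here.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated responses: 968 participants x 23 Likert-type items (1-7), hypothetical.
responses = rng.integers(1, 8, size=(968, 23)).astype(float)

# Split the sample randomly in half: one half for exploratory analysis,
# the other half for confirmation, mirroring the study design.
idx = rng.permutation(len(responses))
explore, confirm = responses[idx[:484]], responses[idx[484:]]

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print("alpha (exploratory half):", round(cronbach_alpha(explore), 2))
print("alpha (confirmatory half):", round(cronbach_alpha(confirm), 2))
```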

Relevance:

30.00%

Publisher:

Abstract:

The urge to gamble is a physiological, psychological, or emotional motivational state, often associated with continued gambling. The authors developed and validated the 6-item Gambling Urge Questionnaire (GUS), which was based on the 8-item Alcohol Urge Questionnaire (M. J. Bohn, D. D. Krahn, & B. A. Staehler, 1995), using 968 community-based participants. Exploratory factor analysis using half of the sample indicated a 1-factor solution that accounted for 55.18% of the total variance. This was confirmed using confirmatory factor analysis with the other half of the sample. The GUS had a Cronbach's alpha coefficient of .81. Concurrent, predictive, and criterion-related validity of the GUS were good, suggesting that the GUS is a valid and reliable instrument for assessing gambling urges among nonclinical gamblers.

Relevance:

30.00%

Publisher:

Abstract:

Sorghum is the main dryland summer crop in NE Australia and a number of agricultural businesses would benefit from an ability to forecast production likelihood at regional scale. In this study we sought to develop a simple agro-climatic modelling approach for predicting shire (statistical local area) sorghum yield. Actual shire yield data, available for the period 1983-1997 from the Australian Bureau of Statistics, were used to train the model. Shire yield was related to a water stress index (SI) that was derived from the agro-climatic model. The model involved a simple fallow and crop water balance that was driven by climate data available at recording stations within each shire. Parameters defining the soil water holding capacity, maximum number of sowings (MXNS) in any year, planting rainfall requirement, and critical period for stress during the crop cycle were optimised as part of the model fitting procedure. Cross-validated correlations (CVR) ranged from 0.5 to 0.9 at shire scale. When aggregated to regional and national scales, 78-84% of the annual variation in sorghum yield was explained. The model was used to examine trends in sorghum productivity and the approach to using it in an operational forecasting system was outlined. (c) 2005 Elsevier B.V. All rights reserved.
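A minimal sketch of the agro-climatic idea, assuming a one-bucket fallow/crop water balance and a stress index defined as mean supply relative to demand over a critical window; the parameter values, rainfall and evaporation series are illustrative, not the fitted shire model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative daily rainfall and potential evapotranspiration (mm) for one season.
rain = rng.gamma(0.4, 8.0, 150)
pet = np.full(150, 6.0)

capacity = 150.0           # plant-available water capacity of the soil (mm), illustrative
critical = slice(60, 100)  # hypothetical critical window around flowering (days)

soil_water = 0.5 * capacity
supply_demand = []
for r, e in zip(rain, pet):
    soil_water = min(capacity, soil_water + r)   # rainfall fills the bucket
    transpiration = min(e, soil_water)           # supply limited by stored water
    soil_water -= transpiration
    supply_demand.append(transpiration / e)      # 1 = unstressed, 0 = fully stressed

# Seasonal water stress index over the critical period; shire yield would then be
# regressed on SI values across years to fit the forecasting model.
si = float(np.mean(supply_demand[critical]))
print(f"stress index SI = {si:.2f}")
```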

Relevance:

30.00%

Publisher:

Abstract:

Cereal-legume intercropping plays an important role in subsistence food production in developing countries, especially in situations of limited water resources. Crop simulation can be used to assess risk for intercrop productivity over time and space. In this study, a simple model for intercropping was developed for cereal and legume growth and yield under semi-arid conditions. The model is based on radiation interception and use, and incorporates a water stress factor. Total dry matter and yield are functions of photosynthetically active radiation (PAR), the fraction of radiation intercepted and radiation use efficiency (RUE). One of two PAR sub-models was used to estimate PAR from solar radiation: either PAR is 50% of solar radiation, or the ratio of PAR to solar radiation (PAR/SR) is a function of the clearness index (K_T). The fraction of radiation intercepted was calculated either from Beer's law with crop extinction coefficients (K) from field experiments or from previous reports. RUE was calculated as a function of available soil water to a depth of 900 mm (ASW). Either the soil water balance method or the decay curve approach was used to determine ASW. Thus, two alternatives for each of three factors, i.e., PAR/SR, K and ASW, were considered, giving eight (2^3) possible models. The model calibration and validation were carried out with maize-bean intercropping systems using data collected in a semi-arid region (Bloemfontein, Free State, South Africa) during seven growing seasons (1996/1997-2002/2003). The combination of PAR estimated from the clearness index, a crop extinction coefficient from the field experiment and the decay curve model gave the most reasonable and acceptable result. The intercrop model developed in this study is simple, so this modelling approach can be employed to develop other cereal-legume intercrop models for semi-arid regions. (c) 2004 Elsevier B.V. All rights reserved.
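A sketch of the radiation-interception core of such a model, using the simpler of the two PAR sub-models (PAR = 50% of solar radiation) and Beer's law; the extinction coefficients, RUE values and the linear RUE-ASW adjustment are hypothetical placeholders, not the calibrated values from the study.

```python
import math

def daily_dry_matter(solar_rad_mj, lai, k, rue_max, asw_fraction):
    """Daily dry matter gain (g m-2) from radiation interception and use.

    solar_rad_mj : incident solar radiation (MJ m-2 day-1)
    lai          : leaf area index of the crop component
    k            : canopy extinction coefficient (Beer's law)
    rue_max      : unstressed radiation use efficiency (g per MJ of PAR)
    asw_fraction : available soil water as a fraction of capacity (0-1)
    """
    par = 0.5 * solar_rad_mj                      # simpler PAR sub-model: PAR = 50% of SR
    f_int = 1.0 - math.exp(-k * lai)              # Beer's law fraction of PAR intercepted
    rue = rue_max * min(1.0, asw_fraction / 0.5)  # hypothetical water-stress scaling of RUE
    return par * f_int * rue

# Maize and bean components of the intercrop on one day (illustrative inputs).
maize = daily_dry_matter(22.0, lai=2.5, k=0.6, rue_max=3.0, asw_fraction=0.4)
bean = daily_dry_matter(22.0, lai=1.2, k=0.8, rue_max=2.0, asw_fraction=0.4)
print(f"maize {maize:.1f} g/m2, bean {bean:.1f} g/m2")
```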

Relevance:

30.00%

Publisher:

Abstract:

The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining the appropriate data quickly increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. (c) 2005 Elsevier B.V. All rights reserved.
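For context, the three Halstead metrics named above are computed from operator and operand counts; a short sketch, with hypothetical counts for a single query rather than the study's measurements, is:

```python
import math

# Counts for a single hypothetical query:
n1, n2 = 8, 12    # distinct operators, distinct operands
N1, N2 = 20, 30   # total operator occurrences, total operand occurrences

length = N1 + N2                          # Halstead program length
vocabulary = n1 + n2
volume = length * math.log2(vocabulary)   # Halstead volume, needed for effort
difficulty = (n1 / 2) * (N2 / n2)         # Halstead difficulty
effort = difficulty * volume              # Halstead effort

print(f"length={length}, difficulty={difficulty:.1f}, effort={effort:.0f}")
```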

Relevance:

30.00%

Publisher:

Abstract:

Workflow systems have traditionally focused on so-called production processes, which are characterized by pre-definition, high volume, and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. However, this flexibility cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we will present a foundation set of constraints for flexible workflow specification. These constraints are intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. The verification of dynamically built models is essential. Whereas ensuring that the model conforms to specified constraints does not pose great difficulty, ensuring that the constraint set itself does not carry conflicts and redundancy is an interesting and challenging problem. In this paper, we will provide a discussion on both the static and dynamic verification aspects. We will also briefly present Chameleon, a prototype workflow engine that implements these concepts. (c) 2004 Elsevier Ltd. All rights reserved.
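As an illustration of the static verification problem, one simple class of conflict in a constraint set is a cycle among ordering constraints. The sketch below uses a hypothetical constraint set, not the paper's constraint language: it flags tasks caught in a cycle and constraints already implied transitively by the rest of the set.

```python
# Hypothetical ordering constraints: (a, b) means "task a must occur before task b".
constraints = {("collect", "review"), ("review", "approve"), ("collect", "approve"),
               ("approve", "collect")}   # the last edge introduces a conflict

def reachable(edges, start):
    """All tasks reachable from `start` by following ordering edges (iterative DFS)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for a, b in edges:
            if a == node and b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

# Conflict: the constraints force some task to occur before itself.
tasks = {t for edge in constraints for t in edge}
conflicts = [t for t in tasks if t in reachable(constraints, t)]

# Redundancy: an ordering already implied by the remaining constraints.
redundant = [e for e in constraints
             if e[1] in reachable(constraints - {e}, e[0])]

print("conflicting tasks:", sorted(conflicts))
print("redundant constraints:", redundant)
```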