80 results for Data modelling


Relevance:

30.00%

Publisher:

Abstract:

Background. The present paper describes a component of a large population cost-effectiveness study that aimed to identify the averted burden and economic efficiency of current and optimal treatment for the major mental disorders. This paper reports on the findings for the anxiety disorders (panic disorder/agoraphobia, social phobia, generalized anxiety disorder, post-traumatic stress disorder and obsessive-compulsive disorder). Method. Outcome was calculated as averted 'years lived with disability' (YLD), a population summary measure of disability burden. Costs were the direct health care costs in 1997-8 Australian dollars. The cost per YLD averted (efficiency) was calculated for those already in contact with the health system for a mental health problem (current care) and for a hypothetical optimal care package of evidence-based treatment for this same group. Data sources included the Australian National Survey of Mental Health and Well-being and published treatment effects and unit costs. Results. Current coverage was around 40% for most disorders with the exception of social phobia at 21%. Receipt of interventions consistent with evidence-based care ranged from 32% of those in contact with services for social phobia to 64% for post-traumatic stress disorder. The cost of this care was estimated at $400 million, resulting in a cost per YLD averted ranging from $7761 for generalized anxiety disorder to $34 389 for panic/agoraphobia. Under optimal care, costs remained similar but health gains were increased substantially, reducing the cost per YLD to < $20 000 for all disorders. Conclusions. Evidence-based care for anxiety disorders would produce greater population health gain at a similar cost to that of current care, resulting in a substantial increase in the cost-effectiveness of treatment.
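The efficiency measure used in this abstract is a simple ratio of cost to health gain. A minimal sketch of the calculation, with invented figures rather than the study's data:

```python
# Hypothetical illustration of the cost-per-YLD-averted calculation;
# the figures below are invented, not taken from the study.

def cost_per_yld_averted(total_cost: float, ylds_averted: float) -> float:
    """Efficiency = direct health care cost divided by the years lived
    with disability (YLD) averted by the care package."""
    return total_cost / ylds_averted

# Example: a $400 million care package that averts 20,000 YLD
print(cost_per_yld_averted(400_000_000, 20_000))  # 20000.0 dollars per YLD
```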

Relevance:

30.00%

Publisher:

Abstract:

This paper analyses the time series behaviour of the initial public offering (IPO) market using an equilibrium model of demand and supply that incorporates the number of new issues, average underpricing, and general market conditions. Model predictions include the existence of serial correlation in both the number of new issues and the average level of underpricing, as well as interactions between these variables and the impact of general market conditions. The model is tested using 40 years of monthly IPO data. The empirical results are generally consistent with predictions.
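A minimal sketch of how the predicted serial correlation might be checked on monthly IPO data. The file name and column names are hypothetical, and statsmodels is assumed to be available:

```python
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

# Hypothetical monthly series: number of new issues and average underpricing.
# 'ipo_monthly.csv' and its column names are placeholders, not the study's data.
df = pd.read_csv("ipo_monthly.csv", parse_dates=["month"], index_col="month")

for col in ["n_issues", "avg_underpricing"]:
    # A first-order autoregression; a clearly positive lag-1 coefficient is
    # consistent with the serial correlation the equilibrium model predicts.
    fit = AutoReg(df[col].dropna(), lags=1).fit()
    print(col, "AR(1) coefficient:", round(fit.params.iloc[1], 3))
```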

Relevance:

30.00%

Publisher:

Abstract:

This paper reviews the key features of an environment to support domain users in spatial information system (SIS) development. It presents a full design and prototype implementation of a repository system for the storage and management of metadata, focusing on a subset of spatial data integrity constraint classes. The system is designed to support spatial system development and customization by users within the domain in which the system will operate.
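As an illustration only, a sketch of how integrity-constraint metadata of this kind might be represented as records in a repository. The class and field names are assumptions, not the paper's schema:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical metadata records for spatial integrity constraints; the field
# names are illustrative assumptions, not the prototype repository's schema.

@dataclass
class IntegrityConstraint:
    name: str                 # e.g. "parcels_no_gaps"
    constraint_class: str     # e.g. "topological", "semantic", "user-defined"
    feature_classes: List[str]
    rule: str                 # declarative rule text stored as metadata

@dataclass
class RepositoryEntry:
    dataset: str
    constraints: List[IntegrityConstraint] = field(default_factory=list)

entry = RepositoryEntry(
    dataset="cadastral_parcels",
    constraints=[IntegrityConstraint(
        name="parcels_no_gaps",
        constraint_class="topological",
        feature_classes=["parcel"],
        rule="parcels must tessellate the mapped extent without gaps",
    )],
)
print(entry.dataset, len(entry.constraints))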

Relevance:

30.00%

Publisher:

Abstract:

Research in conditioning (all the processes of preparation for competition) has used group research designs, where multiple athletes are observed at one or more points in time. However, empirical reports of large inter-individual differences in response to conditioning regimens suggest that applied conditioning research would greatly benefit from single-subject research designs. Single-subject research designs allow us to find out the extent to which a specific conditioning regimen works for a specific athlete, as opposed to the average athlete, who is the focal point of group research designs. The aim of the following review is to outline the strategies and procedures of single-subject research as they pertain to the assessment of conditioning for individual athletes. The four main experimental designs in single-subject research are the AB design, reversal (withdrawal) designs and their extensions, multiple baseline designs and alternating treatment designs. Visual and statistical analyses commonly used to analyse single-subject data are discussed, along with their advantages and limitations. Modelling of multivariate single-subject data using techniques such as dynamic factor analysis and structural equation modelling may identify individualised models of conditioning, leading to better prediction of performance. Despite problems associated with data analyses in single-subject research (e.g. serial dependency), sports scientists should use single-subject research designs in applied conditioning research to understand how well an intervention (e.g. a training method) works and to predict performance for a particular athlete.
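A minimal sketch of one common summary for an AB (baseline/intervention) single-subject design. The measurements are hypothetical sprint times for a single athlete; the percentage of non-overlapping data (PND) is one of several statistics used in this literature:

```python
import numpy as np

# Hypothetical AB single-subject data: compare an athlete's baseline (A)
# phase with the intervention (B) phase; lower times are better.
baseline = np.array([52.1, 51.8, 52.4, 52.0, 51.9])      # phase A measurements
intervention = np.array([51.2, 50.9, 50.7, 50.8, 50.5])  # phase B measurements

# Percentage of non-overlapping data (PND): share of B points that are
# better than the best (lowest) baseline point.
pnd = np.mean(intervention < baseline.min()) * 100
print(f"Mean change: {intervention.mean() - baseline.mean():+.2f} s")
print(f"PND: {pnd:.0f}%")
```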

Relevance:

30.00%

Publisher:

Abstract:

A pulse of chromated copper arsenate (CCA, a timber preservative) was applied in irrigation water to an undisturbed field soil in a laboratory column. Concentrations of various elements in the leachate from the column were measured during the experiment, and the amounts remaining within the soil were measured at the end of the experiment. The geochemical modelling package PHREEQC-2 was used to simulate the experimental data. Processes included in the CCA transport modelling were advection, dispersion, non-specific adsorption (cation exchange) and specific adsorption by clay minerals and organic matter, as well as other possible chemical reactions such as precipitation/dissolution. The modelling effort highlighted the possible complexities in CCA transport and reaction experiments. For example, the uneven dosing of CCA as well as incomplete knowledge of the soil properties resulted in simulations that gave only partial, although reasonable, agreement with the experimental data. Both the experimental data and simulations show that As and Cu are strongly adsorbed and will therefore mostly remain at the top of the soil profile, with a small proportion appearing in leachate. On the other hand, Cr is more mobile and is thus present in the soil column leachate. Further simulations show that both the quantity of CCA added to the soil and the pH of the irrigation water will influence CCA transport. Simulations suggest that application of larger doses of CCA to the soil will result in higher leachate concentrations, especially for Cu and As. Irrigation water with a lower pH will dramatically increase leaching of Cu. These results indicate that acidic rainfall or significant accidental spillage of CCA will increase the risk of groundwater pollution.
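To illustrate the transport processes named above, a generic one-dimensional advection-dispersion sketch with linear retardation. This is not the PHREEQC-2 model used in the study; the grid, velocities and retardation factors are all hypothetical:

```python
import numpy as np

# Illustrative 1D advection-dispersion transport with linear retardation,
# solved with a simple explicit finite-difference scheme. All parameters are
# hypothetical; stronger retardation stands in for stronger adsorption.
L, n, dt, steps = 0.3, 60, 50.0, 2000           # column length (m), cells, time step (s), steps
dx = L / n
v, D = 1e-6, 5e-8                               # pore velocity (m/s), dispersion (m2/s)
R = {"Cr": 2.0, "Cu": 25.0, "As": 40.0}         # assumed retardation factors

c = {el: np.zeros(n) for el in R}
for step in range(steps):
    pulse = 1.0 if step * dt < 20_000 else 0.0  # CCA pulse at the inlet
    for el, r in R.items():
        ci = c[el]
        new = ci.copy()
        new[1:-1] = ci[1:-1] + (dt / r) * (
            D * (ci[2:] - 2 * ci[1:-1] + ci[:-2]) / dx**2
            - v * (ci[1:-1] - ci[:-2]) / dx
        )
        new[0], new[-1] = pulse, ci[-2]         # inlet concentration, free outflow
        c[el] = new

x = (np.arange(n) + 0.5) * dx
for el in R:
    depth = (c[el] * x).sum() / max(c[el].sum(), 1e-12)
    print(el, "mean penetration depth (m):", round(depth, 3))
```

With these assumed parameters the weakly retarded element (Cr) penetrates furthest, mirroring the qualitative behaviour described in the abstract.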

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates how demographic (socioeconomic) and land-use (physical and environmental) data can be integrated within a decision support framework to formulate and evaluate land-use planning scenarios. A case-study approach is undertaken with land-use planning scenarios for a rapidly growing coastal area in Australia, the Shire of Hervey Bay. The town and surrounding area require careful planning of future urban growth among competing land uses. Three potential urban growth scenarios are put forward to address this issue. Scenario A ('continued growth') is based on existing socioeconomic trends. Scenario B ('maximising rates base') is derived using optimisation modelling of land-valuation data. Scenario C ('sustainable development') is derived using a number of social, economic, and environmental factors and assigning weightings of importance to each factor using a multiple criteria analysis approach. The land-use planning scenarios are presented through the use of maps and tables within a geographical information system, which delineate possible future land-use allocations up to 2021. The planning scenarios are evaluated by using a goal-achievement matrix approach. The matrix is constructed with a number of criteria derived from key policy objectives outlined in the regional growth management framework and town planning schemes. The authors of this paper examine the final efficiency scores calculated for each of the three planning scenarios and discuss the advantages and disadvantages of the three land-use modelling approaches used to formulate the final scenarios.
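A minimal sketch of a goal-achievement matrix evaluation of the kind described above. The criteria, scores and weights below are hypothetical, not the Hervey Bay study's values:

```python
import numpy as np

# Goal-achievement matrix sketch: rows are policy criteria, columns are the
# three scenarios (A, B, C); weights reflect the assumed importance of each
# criterion. All numbers are hypothetical illustrations.
criteria = ["urban consolidation", "rates revenue", "habitat protection", "flood risk"]
weights = np.array([0.3, 0.2, 0.3, 0.2])
scores = np.array([          # criterion scores per scenario, on a 0-10 scale
    [7, 5, 8],               # urban consolidation
    [5, 9, 6],               # rates revenue
    [4, 3, 9],               # habitat protection
    [6, 5, 8],               # flood risk
])

efficiency = weights @ scores
for name, score in zip(["A: continued growth", "B: maximising rates base",
                        "C: sustainable development"], efficiency):
    print(f"Scenario {name}: weighted score {score:.2f}")
```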

Relevance:

30.00%

Publisher:

Abstract:

The central composite rotatable design (CCRD) was used to design an experimental program to model the effects of inlet pressure, feed density, and the length and diameter of the inner vortex finder (IVF) on the operational performance of a 150-mm three-product cyclone. The ranges of the design variables were: inlet pressure 80-130 kPa; feed density 30-60% solids; length of the IVF below the outer vortex finder (OVF) 50-585 mm; diameter of the IVF 35-50 mm. A total of 30 tests were conducted, which is 51 fewer than would be required for a three-level full factorial design. Because the model allows confident performance prediction by interpolation over the range of data in the database, it was used to construct response surface graphs describing the effects of the variables on the performance of the three-product cyclone. To obtain a simple yet realistic model, it was refitted using only the variable terms significant at a confidence level of 90% or greater. Considering the selected operating variables, the resultant model is significant and predicts the experimental data well. (c) 2005 Elsevier B.V. All rights reserved.
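For illustration, a sketch of fitting a second-order response surface (intercept, linear, squared and two-factor interaction terms for four factors) by least squares. The design points and response values are random placeholders, not the cyclone data:

```python
import numpy as np
from itertools import combinations

# Sketch of fitting a second-order response surface to a four-factor
# design with 30 runs. The data are random placeholders, not the paper's
# cyclone measurements.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 4))       # 4 coded factors, 30 runs
y = rng.normal(size=30)                    # placeholder response values

def quadratic_terms(X):
    """Intercept, linear, squared and two-factor interaction columns."""
    cols = [np.ones(len(X)), *X.T, *(X.T ** 2)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

A = quadratic_terms(X)                     # 15 model terms for 30 observations
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(beta, 3))
```

In practice, terms whose coefficients are not significant at the chosen confidence level would then be dropped and the reduced model refitted, as the abstract describes.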

Relevance:

30.00%

Publisher:

Abstract:

Nucleation is the first stage in any granulation process where binder liquid first comes into contact with the powder. This paper investigates the nucleation process where binder liquid is added to a fine powder with a spray nozzle. The dimensionless spray flux approach of Hapgood et al. (Powder Technol. 141 (2004) 20) is extended to account for nonuniform spray patterns and allow for overlap of nuclei granules rather than spray drops. A dimensionless nuclei distribution function which describes the effects of the design and operating parameters of the nucleation process (binder spray characteristics, the nucleation area ratio between droplets and nuclei and the powder bed velocity) on the fractional surface area coverage of nuclei on a moving powder bed is developed. From this starting point, a Monte Carlo nucleation model that simulates full nuclei size distributions as a function of the design and operating parameters that were implemented in the dimensionless nuclei distribution function is developed. The nucleation model was then used to investigate the effects of the design and operating parameters on the formed nuclei size distributions and to correlate these effects to changes of the dimensionless nuclei distribution function. Model simulations also showed that it is possible to predict nuclei size distributions beyond the drop controlled nucleation regime in Hapgood's nucleation regime map. Qualitative comparison of model simulations and experimental nucleation data showed similar shapes of the nuclei size distributions. In its current form, the nucleation model can replace the nucleation term in one-dimensional population balance models describing wet granulation processes. Implementation of more sophisticated nucleation kinetics can make the model applicable to multi-dimensional population balance models.
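A toy Monte Carlo sketch of the nucleation picture described above: drops land at random positions on a sprayed strip of powder, each forms a nucleus larger than the drop, and the fractional surface coverage accounts for overlap. All parameters are hypothetical and this is far simpler than the paper's nucleation model:

```python
import numpy as np

# Monte Carlo sketch of spray nucleation on a moving powder bed strip.
# Hypothetical parameters; not the paper's dimensionless nuclei distribution
# function or full nucleation model.
rng = np.random.default_rng(1)
bed_w, bed_l = 0.05, 0.2         # m, width and length of the sprayed strip
n_drops = 1000
drop_d = 0.5e-3                  # m, drop diameter (assumed)
area_ratio = 4.0                 # nucleus area / drop area (assumed)
nuc_r = 0.5 * drop_d * np.sqrt(area_ratio)

centres = rng.uniform([0, 0], [bed_w, bed_l], size=(n_drops, 2))

# Fractional surface coverage estimated on a grid, so overlapping nuclei
# are only counted once.
gx, gy = np.meshgrid(np.linspace(0, bed_w, 100), np.linspace(0, bed_l, 400))
covered = np.zeros(gx.shape, dtype=bool)
for cx, cy in centres:
    covered |= (gx - cx) ** 2 + (gy - cy) ** 2 <= nuc_r ** 2
print("fractional nuclei coverage:", round(covered.mean(), 3))
```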

Relevance:

30.00%

Publisher:

Abstract:

Solids concentration and particle size distribution gradually change in the vertical dimension of industrial flotation cells, subject primarily to the flotation cell size and design and the cell operating conditions. As entrainment is a two-step process and involves only the suspended solids in the top pulp region near the pulp-froth interface, the solids suspension characteristics have a significant impact on the overall entrainment. In this paper, a classification function is proposed to describe the state of solids suspension in flotation cells, similar to the definition of degree of entrainment for classification in the froth phase found in the literature. A mathematical model for solids suspension is also developed, in which the classification function is expressed as an exponential function of the particle size. Experimental data collected from three different Outokumpu tank flotation cells in three different concentrators are well fitted by the proposed exponential model. Under the prevailing experimental conditions, it was found that the solids content in the top region was relatively independent of cell operating conditions such as froth height and air rate but dependent on the cell size. Moreover, the results obtained from the total solids tend to be similar to those from a particular gangue mineral and hence may be applied to all minerals in entrainment calculation. (C) 2004 Elsevier Ltd. All rights reserved.
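A minimal sketch of fitting an exponential classification function of particle size, the general form proposed above. The data points and the parameterisation CF(d) = exp(-d/d0) are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical classification data: classification value versus particle size
# for solids suspended near the pulp-froth interface.
size_um = np.array([10, 20, 38, 53, 75, 106, 150])
cf_obs = np.array([0.95, 0.88, 0.74, 0.61, 0.45, 0.30, 0.18])

def cf_model(d, d0):
    """Assumed exponential classification function of particle size."""
    return np.exp(-d / d0)

(d0,), _ = curve_fit(cf_model, size_um, cf_obs, p0=[50.0])
print(f"fitted d0 = {d0:.1f} um")
```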

Relevance:

30.00%

Publisher:

Abstract:

New tools derived from advances in molecular biology have not been widely adopted in plant breeding for complex traits because of the inability to connect information at gene level to the phenotype in a manner that is useful for selection. In this study, we explored whether physiological dissection and integrative modelling of complex traits could link phenotype complexity to underlying genetic systems in a way that enhanced the power of molecular breeding strategies. A crop and breeding system simulation study on sorghum, which involved variation in 4 key adaptive traits (phenology, osmotic adjustment, transpiration efficiency, stay-green) and a broad range of production environments in north-eastern Australia, was used. The full matrix of simulated phenotypes, which consisted of 547 location-season combinations and 4235 genotypic expression states, was analysed for genetic and environmental effects. The analysis was conducted in stages assuming gradually increased understanding of gene-to-phenotype relationships, which would arise from physiological dissection and modelling. It was found that environmental characterisation and physiological knowledge helped to explain and unravel gene and environment context dependencies in the data. Based on the analyses of gene effects, a range of marker-assisted selection breeding strategies was simulated. It was shown that the inclusion of knowledge resulting from trait physiology and modelling generated an enhanced rate of yield advance over cycles of selection. This occurred because the knowledge associated with component trait physiology and extrapolation to the target population of environments by modelling removed confounding effects associated with environment and gene context dependencies for the markers used. Developing and implementing this gene-to-phenotype capability in crop improvement requires enhanced attention to phenotyping, ecophysiological modelling, and validation studies to test the stability of candidate genetic regions.
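A toy sketch of marker-assisted selection over cycles with purely additive gene effects and random environmental noise. The gene effects, population size and selection scheme are hypothetical and vastly simpler than the crop and breeding system simulation described above:

```python
import numpy as np

# Toy marker-assisted selection simulation: additive effects plus noise,
# selection on a marker score, random mating each cycle. All settings are
# hypothetical illustrations, not the study's simulation.
rng = np.random.default_rng(2)
n_genes, pop, cycles = 15, 200, 10
effects = rng.normal(0, 1, n_genes)                 # assumed additive effects
geno = rng.integers(0, 2, (pop, n_genes))           # 0/1 allele states

for cycle in range(cycles):
    pheno = geno @ effects + rng.normal(0, 2, pop)  # genotype + environment
    marker_score = geno @ effects                   # markers assumed perfect here
    parents = geno[np.argsort(marker_score)[-40:]]  # select the top 20% on markers
    # random mating: each progeny inherits each allele from one of two parents
    mothers = parents[rng.integers(0, len(parents), pop)]
    fathers = parents[rng.integers(0, len(parents), pop)]
    pick = rng.integers(0, 2, (pop, n_genes)).astype(bool)
    geno = np.where(pick, mothers, fathers)
    print(f"cycle {cycle}: mean phenotype {pheno.mean():.2f}")
```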

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new method for producing a functional-structural plant model that simulates response to different growth conditions, yet does not require detailed knowledge of underlying physiology. The example used to present this method is the modelling of the mountain birch tree. This new functional-structural modelling approach is based on linking an L-system representation of the dynamic structure of the plant with a canonical mathematical model of plant function. Growth indicated by the canonical model is allocated to the structural model according to probabilistic growth rules, such as rules for the placement and length of new shoots, which were derived from an analysis of architectural data. The main advantage of the approach is that it is relatively simple compared to the prevalent process-based functional-structural plant models and does not require a detailed understanding of underlying physiological processes, yet it is able to capture important aspects of plant function and adaptability, unlike simple empirical models. This approach, combining canonical modelling, architectural analysis and L-systems, thus fills the important role of providing an intermediate level of abstraction between the two extremes of deeply mechanistic process-based modelling and purely empirical modelling. We also investigated the relative importance of various aspects of this integrated modelling approach by analysing the sensitivity of the standard birch model to a number of variations in its parameters, functions and algorithms. The results show that using light as the sole factor determining the structural location of new growth gives satisfactory results. Including the influence of additional regulating factors made little difference to global characteristics of the emergent architecture. Changing the form of the probability functions and using alternative methods for choosing the sites of new growth also had little effect. (c) 2004 Elsevier B.V. All rights reserved.
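A minimal sketch of a stochastic L-system, the formalism used above to represent dynamic plant structure. The symbols, rewriting rules and branching probability are hypothetical, not the birch model's rules:

```python
import random

# Stochastic L-system sketch: an apex A either branches or simply extends,
# with a fixed probability. Rules and probabilities are hypothetical.
random.seed(0)

def rewrite(symbol):
    if symbol == "A":                      # apex: branch or extend
        return "F[+A][-A]" if random.random() < 0.4 else "FA"
    return symbol                          # F, +, -, [ and ] are terminals

def derive(axiom, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rewrite(ch) for ch in s)
    return s

structure = derive("A", 6)
print(len(structure), "symbols; active apices:", structure.count("A"))
```

In a functional-structural model of the kind described, the probability of placing new growth at a given apex would be driven by the growth allocated by the canonical model and by local factors such as light, rather than by a fixed constant.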

Relevance:

30.00%

Publisher:

Abstract:

We demonstrate a portable process for developing a triple bottom line model to measure the knowledge production performance of individual research centres. For the first time, this study also empirically illustrates how a fully units-invariant model of Data Envelopment Analysis (DEA) can be used to measure the relative efficiency of research centres by capturing the interaction amongst a common set of multiple inputs and outputs. This study is particularly timely given the increasing transparency required by governments and industries that fund research activities. The process highlights the links between organisational objectives, desired outcomes and outputs while the emerging performance model represents an executive managerial view. This study brings consistency to current measures that often rely on ratios and univariate analyses that are not otherwise conducive to relative performance analysis.
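For illustration, a sketch of an input-oriented CCR DEA efficiency calculation solved as a linear program. The research-centre inputs and outputs below are invented, and this plain CCR formulation merely stands in for the fully units-invariant DEA model the paper uses:

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA (envelopment form): for each centre o, minimise
# theta subject to a composite of all centres using no more than theta times
# centre o's inputs while producing at least its outputs. Data are hypothetical.
X = np.array([[12,  8, 20,  6],        # inputs: e.g. staff FTE per centre
              [1.5, 0.9, 2.8, 0.6]])   #          research funding ($M)
Y = np.array([[30, 25, 41, 15],        # outputs: e.g. publications
              [ 2,  1,  5,  1]])       #           commercial outcomes

m, n = X.shape
s = Y.shape[0]
for o in range(n):
    c = np.r_[1.0, np.zeros(n)]                    # minimise theta
    A_in = np.c_[-X[:, [o]], X]                    # sum(lambda_j x_j) <= theta x_o
    A_out = np.c_[np.zeros((s, 1)), -Y]            # sum(lambda_j y_j) >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    print(f"centre {o}: relative efficiency = {res.x[0]:.3f}")
```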

Relevance:

30.00%

Publisher:

Abstract:

Aims To develop a pharmacokinetic-pharmacodynamic model describing the time-course of QT interval prolongation after citalopram overdose and to evaluate the effect of charcoal on the relative risk of developing abnormal QT and heart-rate combinations. Methods Plasma concentrations and electrocardiograph (ECG) data from 52 patients after 62 citalopram overdose events were analysed in WinBUGS using a Bayesian approach. The reported doses ranged from 20 to 1700 mg and on 17 of the events a single dose of activated charcoal was administered. The developed pharmacokinetic-pharmacodynamic model was used for predicting the probability of having abnormal combinations of QT-RR, which was assumed to be related to an increased risk for torsade de pointes (TdP). Results The absolute QT interval was related to the observed heart rate with an estimated individual heart-rate correction factor [alpha = 0.36, between-subject coefficient of variation (CV) = 29%]. The heart-rate corrected QT interval was linearly dependent on the predicted citalopram concentration (slope = 40 ms L mg⁻¹, between-subject CV = 70%) in a hypothetical effect-compartment (half-life of effect-delay = 1.4 h). The heart-rate corrected QT was predicted to be higher in women than in men and to increase with age. Administration of activated charcoal resulted in a pronounced reduction of the QT prolongation and was shown to reduce the risk of having abnormal combinations of QT-RR by approximately 60% for citalopram doses above 600 mg. Conclusion Citalopram caused a delayed lengthening of the QT interval. Administration of activated charcoal was shown to reduce the risk that the QT interval exceeds a previously defined threshold and therefore is expected to reduce the risk of TdP.
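A sketch of the type of effect-compartment relationship described above: plasma concentration drives a hypothetical effect compartment, and the heart-rate-corrected QT rises linearly with the effect-site concentration. The slope, effect-delay half-life and alpha are taken loosely from the abstract, but the plasma concentration profile and baseline QTc are invented:

```python
import numpy as np

# Effect-compartment PK-PD sketch. Parameters loosely follow the abstract
# (slope 40 ms per mg/L, effect-delay half-life 1.4 h, alpha = 0.36); the
# mono-exponential plasma profile and baseline QTc are hypothetical.
t = np.linspace(0, 24, 241)                    # hours
cp = 1.2 * np.exp(-0.08 * t)                   # hypothetical plasma conc (mg/L)

ke0 = np.log(2) / 1.4                          # effect-delay rate constant (1/h)
ce = np.zeros_like(cp)
for i in range(1, len(t)):                     # dCe/dt = ke0 * (Cp - Ce)
    dt = t[i] - t[i - 1]
    ce[i] = ce[i - 1] + ke0 * (cp[i - 1] - ce[i - 1]) * dt

qtc_base, slope, alpha = 400.0, 40.0, 0.36     # ms, ms per mg/L, HR correction
rr = 0.8                                       # RR interval (s), about 75 bpm
qt = (qtc_base + slope * ce) * rr ** alpha     # QT = QTc * RR**alpha
print(f"max QT = {qt.max():.0f} ms at t = {t[np.argmax(qt)]:.1f} h")
```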

Relevance:

30.00%

Publisher:

Abstract:

The aim of the study presented was to implement a process model to simulate the dynamic behaviour of a pilot-scale process for anaerobic two-stage digestion of sewage sludge. The model was implemented to support experimental investigations of the anaerobic two-stage digestion process. The model concept implemented in the simulation software package MATLAB(TM)/Simulink(R) is a derivative of the IWA Anaerobic Digestion Model No. 1 (ADM1), which was developed by the IWA task group for mathematical modelling of anaerobic processes. In the present study the original model concept has been adapted and applied to replicate a two-stage digestion process. Testing procedures, including balance checks and 'benchmarking' tests, were carried out to verify the accuracy of the implementation. These combined measures ensured a faultless model implementation without numerical inconsistencies. Parameters for both the thermophilic and the mesophilic process stages have been estimated successfully using data from lab-scale experiments described in the literature. Due to the high number of parameters in the structured model, it was necessary to develop a customised procedure that limited the range of parameters to be estimated. The accuracy of the optimised parameter sets has been assessed against experimental data from pilot-scale experiments. Under these conditions, the model predicted the dynamic behaviour of a two-stage digestion process at pilot scale reasonably well. (C) 2004 Elsevier Ltd. All rights reserved.
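As a much-simplified illustration of a two-stage digester, a sketch of two completely mixed reactors in series with Monod-type substrate degradation. This is an illustrative stand-in, not the ADM1-based model described above; all volumes, rates and feed values are hypothetical:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two completely mixed reactors in series (thermophilic then mesophilic)
# with Monod-type substrate removal. Hypothetical parameters; far simpler
# than ADM1, which tracks many more components and processes.
Q = 0.5                        # feed flow (m3/d)
V1, V2 = 2.0, 8.0              # stage volumes (m3)
S_in = 40.0                    # feed substrate (kg COD/m3)
k1, k2, Ks = 6.0, 2.0, 4.0     # max removal rates (kg/m3/d) and half-saturation (kg/m3)

def rhs(t, s):
    s1, s2 = s
    r1 = k1 * s1 / (Ks + s1)
    r2 = k2 * s2 / (Ks + s2)
    ds1 = Q / V1 * (S_in - s1) - r1
    ds2 = Q / V2 * (s1 - s2) - r2
    return [ds1, ds2]

sol = solve_ivp(rhs, (0, 60), [S_in, S_in])
s1, s2 = sol.y[:, -1]
print(f"after 60 d: stage 1 = {s1:.1f}, stage 2 = {s2:.1f} kg COD/m3")
```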

Relevance:

30.00%

Publisher:

Abstract:

This paper provides information on the experimental set-up, data collection methods and results to date for the project Large scale modelling of coarse grained beaches, undertaken at the Large Wave Channel (GWK) of FZK in Hannover by an international group of researchers in spring 2002. The main objective of the experiments was to provide full-scale measurements of cross-shore processes on gravel and mixed beaches for the verification and further development of cross-shore numerical models of gravel and mixed-sediment beaches. Identical random and regular wave tests were undertaken for a gravel beach and a mixed sand/gravel beach set up in the flume. Measurements included profile development, water surface elevation along the flume, internal pressures in the swash zone, piezometric head levels within the beach, run-up, flow velocities in the surf zone and sediment size distributions. The purpose of the paper is to present to the scientific community the experimental procedure, a summary of the data collected and some initial results, as well as a brief outline of the ongoing research being carried out with the data by different research groups. The experimental data are available to the whole scientific community following submission of a statement of objectives, specification of data requirements and an agreement to abide by the GWK and EU protocols. (C) 2005 Elsevier B.V. All rights reserved.