58 results for Model-based optimization


Relevance: 90.00%

Abstract:

We report the first steps of a collaborative project between the University of Queensland, Polyflow, Michelin, SK Chemicals, and RMIT University on the simulation, validation and application of a recently introduced constitutive model designed to describe branched polymers. Whereas much progress has been made on predicting the complex flow behaviour of many polymers, in particular linear ones, it sometimes proves difficult to predict shear thinning and extensional strain hardening simultaneously using traditional constitutive models. Recently, a new viscoelastic model based on molecular topology was proposed by McLeish and Larson (1998). We explore the predictive power of a differential multi-mode version of this pom-pom model for the flow behaviour of two commercial polymer melts: a (long-chain branched) low-density polyethylene (LDPE) and a (linear) high-density polyethylene (HDPE). The model responses are compared to elongational recovery experiments published by Langouche and Debbaut (1999), and to start-up of simple shear flow and stress relaxation after simple and reverse step strain experiments carried out in our laboratory.
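As a rough illustration of the pom-pom idea, the sketch below integrates a single-mode differential pom-pom model (orientation tensor plus backbone stretch capped at the arm number q) in start-up of simple shear. The equations follow the commonly cited differential approximation; all parameter values are assumed for illustration, not the multi-mode spectra fitted to the LDPE/HDPE melts discussed in the text.

```python
import numpy as np

# Single-mode differential pom-pom sketch in start-up of simple shear.
# Parameters are illustrative only (assumed, not fitted to any melt).
G, tau_b, tau_s, q = 1.0e4, 1.0, 0.5, 5.0   # modulus, orientation/stretch times, arm number
nu = 2.0 / q                                 # drag-strain coupling exponent
gdot = 1.0                                   # shear rate (1/s)

kappa = np.zeros((3, 3)); kappa[0, 1] = gdot  # velocity-gradient tensor
A = np.eye(3) / 3.0                           # orientation tensor, equilibrium start
lam = 1.0                                     # backbone stretch
dt, t_end = 1e-3, 5.0

for _ in range(int(t_end / dt)):
    S = A / np.trace(A)                       # normalised orientation
    # orientation: upper-convected evolution with relaxation towards isotropy
    A = A + dt * (kappa @ A + A @ kappa.T - (A - np.eye(3) / 3.0) / tau_b)
    # stretch: affine stretching vs. arm-retraction relaxation, capped at q
    lam = lam + dt * (lam * gdot * S[0, 1]
                      - (lam - 1.0) / tau_s * np.exp(nu * (lam - 1.0)))
    lam = min(lam, q)

S = A / np.trace(A)
sigma_xy = 3.0 * G * lam**2 * S[0, 1]         # shear stress component
print(f"stretch = {lam:.3f}, shear stress = {sigma_xy:.1f} Pa")
```

A multi-mode version, as used in the paper, would sum such contributions over a discrete relaxation spectrum.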

Relevance: 90.00%

Abstract:

Low concentrate density from wet drum magnetic separators in dense medium circuits can cause operating difficulties due to an inability to reach the required circulating medium density and, indirectly, high medium solids losses. The literature is almost silent on the processes controlling concentrate density. However, the common name for the region through which concentrate is discharged, the squeeze pan gap, implies that some extrusion process is thought to be at work. There is no model of magnetics recovery in a wet drum magnetic separator which includes as inputs all significant machine and operating variables. A series of trials, in both factorial experiments and in single variable experiments, was done using a purpose-built rig which featured a small industrial-scale (700 mm lip length, 900 mm diameter) wet drum magnetic separator. A substantial data set of 191 trials was generated in this work. The results of the factorial experiments were used to identify the variables having a significant effect on magnetics recovery. It is proposed, based both on the experimental observations of the present work and on observations reported in the literature, that the process controlling magnetic separator concentrate density is one of drainage. Such a process should be definable by an initial moisture, a drainage rate and a drainage time, the latter being set by the volumetric flowrate and the volume within the drainage zone. The magnetics can be characterised by an experimentally derived ultimate drainage moisture. A model based on these concepts and containing adjustable parameters was developed. This model was then fitted to a randomly chosen 80% of the data, and validated by application to the remaining 20%. The model is shown to be a good fit to the data over concentrate solids contents from 40% to 80% solids and for both magnetite and ferrosilicon feeds. (C) 2003 Elsevier Science B.V. All rights reserved.
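The drainage structure described (initial moisture decaying towards an ultimate drainage moisture over a residence time set by flowrate and zone volume) can be sketched as below. The first-order decay form and all numbers are assumptions for illustration, not the paper's fitted model.

```python
import math

# Illustrative drainage model for wet-drum concentrate moisture:
# moisture relaxes from an initial value towards an experimentally
# derived ultimate drainage moisture over the drainage time.
# All parameter values here are assumed, not fitted.
def concentrate_moisture(m_initial, m_ultimate, k_drain, zone_volume, flowrate):
    """Moisture fraction after draining for t = zone_volume / flowrate."""
    t_drain = zone_volume / flowrate                  # drainage time (s)
    return m_ultimate + (m_initial - m_ultimate) * math.exp(-k_drain * t_drain)

# assumed values: 60% initial moisture draining towards 20% ultimate moisture
m_slow = concentrate_moisture(0.60, 0.20, k_drain=0.5, zone_volume=0.02, flowrate=0.002)
m_fast = concentrate_moisture(0.60, 0.20, k_drain=0.5, zone_volume=0.02, flowrate=0.02)
print(f"low flowrate: {m_slow:.3f}, high flowrate: {m_fast:.3f}")
```

A higher flowrate shortens the drainage time and so yields a wetter (lower solids) concentrate, which is the qualitative behaviour the drainage hypothesis predicts.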

Relevance: 90.00%

Abstract:

Loss of magnetic medium solids from dense medium circuits is a substantial contributor to operating cost. Much of this loss is by way of wet drum magnetic separator effluent. A model of the separator would be useful for process design, optimisation and control. A review of the literature established that although various rules of thumb exist, largely based on empirical or anecdotal evidence, there is no model of magnetics recovery in a wet drum magnetic separator which includes as inputs all significant machine and operating variables. A series of trials, in both factorial experiments and in single variable experiments, was therefore carried out using a purpose-built rig which featured a small industrial-scale (700 mm lip length, 900 mm diameter) wet drum magnetic separator. A substantial data set of 191 trials was generated in the work. The results of the factorial experiments were used to identify the variables having a significant effect on magnetics recovery. Observations carried out as an adjunct to this work, as well as magnetic theory, suggest that the capture of magnetic particles in the wet drum magnetic separator is by a flocculation process. Such a process should be defined by a flocculation rate and a flocculation time, the latter being defined by the volumetric flowrate and the volume within the separation zone. A model based on this concept and containing adjustable parameters was developed. This model was then fitted to a randomly chosen 80% of the data, and validated by application to the remaining 20%. The model is shown to provide a satisfactory fit to the data over three orders of magnitude of magnetics loss. (C) 2003 Elsevier Science B.V. All rights reserved.
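A minimal sketch of the rate-and-residence-time structure described: magnetics are captured by flocculation at some rate over a flocculation time set by the separation-zone volume and the volumetric flowrate. The first-order capture form and the numbers are assumptions for illustration, not the fitted model.

```python
import math

# Illustrative magnetics-recovery model: capture by flocculation at
# rate k_floc over a residence time t = zone_volume / flowrate.
# Parameter values are assumed for illustration only.
def magnetics_recovery(k_floc, zone_volume, flowrate):
    t_floc = zone_volume / flowrate          # flocculation (residence) time, s
    return 1.0 - math.exp(-k_floc * t_floc)  # fraction of magnetics captured

loss_low_flow = 1.0 - magnetics_recovery(k_floc=2.0, zone_volume=0.01, flowrate=0.005)
loss_high_flow = 1.0 - magnetics_recovery(k_floc=2.0, zone_volume=0.01, flowrate=0.05)
print(f"magnetics loss: {loss_low_flow:.2e} (low flow) vs {loss_high_flow:.2e} (high flow)")
```

Because loss decays exponentially with residence time, modest flowrate changes move the loss across orders of magnitude, consistent with the wide loss range the fitted model covers.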

Relevance: 90.00%

Abstract:

This paper presents a new model, based on thermodynamics and intermolecular interactions, to describe the vapour-liquid phase equilibria and surface tension of pure components. The model assumes that the bulk fluid can be characterised as a set of parallel layers. Because of this molecular structure, we name the model the molecular layer structure theory (MLST). Each layer has two energetic components. One is the interaction energy of one molecule of that layer with all surrounding layers. The other is the intra-layer Helmholtz free energy, which accounts for the internal energy and the entropy of that layer. The equilibrium between two separate phases is derived from the minimum of the grand potential, and the surface tension is calculated as the excess Helmholtz energy of the system. We test this model against a number of components: argon, krypton, ethane, n-butane, iso-butane, ethylene and sulphur hexafluoride; the results are very satisfactory. (C) 2002 Elsevier Science B.V. All rights reserved.
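The variational structure just described can be written schematically. The notation below (layer densities ρ_j, intra-layer free energy a(ρ_j), inter-layer interaction energies u_jk) is assumed for illustration and is not the paper's exact formulation:

```latex
% Grand potential of the layered fluid (schematic; notation assumed)
\Omega[\{\rho_j\}] = \sum_j a(\rho_j)
  + \tfrac{1}{2}\sum_j \sum_{k \neq j} u_{jk}\,\rho_j \rho_k
  - \mu \sum_j \rho_j ,
\qquad
\frac{\partial \Omega}{\partial \rho_j} = 0 \quad \forall j ,
% surface tension as the excess Helmholtz energy per unit interfacial area
\gamma = \frac{1}{\mathcal{A}}\left( F[\{\rho_j\}] - F_{\mathrm{bulk}} \right).
```

Minimising Ω over the layer densities yields the coexisting phase profiles, and the residual free energy of the inhomogeneous region gives the surface tension.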

Relevance: 90.00%

Abstract:

In a previous paper, Hoornaert et al. (Powder Technol. 96 (1998) 116-128) presented data from granulation experiments performed in a 50 L Lödige high-shear mixer. In this study, the same data were simulated with a population balance model. Based on an analysis of the experimental data, the granulation process was divided into three separate stages: nucleation, induction, and coalescence growth. These three stages were then simulated separately, with promising results. It is possible to derive a kernel that fits both the induction and the coalescence growth stages. Modeling the nucleation stage proved more challenging due to the complex mechanism of nucleus formation. From this work, some recommendations are made for the improvement of this type of model.
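A discretised population balance of the kind used for such simulations can be sketched with a size-independent coalescence kernel (a simplified stand-in for the stage-specific kernels discussed; all values are illustrative):

```python
# Minimal discretised population-balance coalescence step (Smoluchowski
# form) with a size-independent kernel beta0. Size classes are integer
# multiples of a primary-granule volume; values are illustrative.
N_CLASSES = 200
beta0, dt, steps = 1e-2, 0.1, 50
n = [0.0] * (N_CLASSES + 1)
n[1] = 1.0                                   # monodisperse start, number density

for _ in range(steps):
    total = sum(n)
    # birth of size i by coalescence of j and i-j; death by any collision
    birth = [0.0] * (N_CLASSES + 1)
    for i in range(1, N_CLASSES + 1):
        birth[i] = 0.5 * beta0 * sum(n[j] * n[i - j] for j in range(1, i))
    n = [n[i] + dt * (birth[i] - beta0 * n[i] * total) for i in range(N_CLASSES + 1)]

number = sum(n)                              # total granule number (decreases)
mass = sum(i * n[i] for i in range(N_CLASSES + 1))  # first moment (conserved)
print(f"number = {number:.4f}, mass = {mass:.4f}")
```

Coalescence reduces the total number of granules while conserving total mass, the basic consistency check for any candidate growth kernel.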

Relevance: 90.00%

Abstract:

We examine the event statistics obtained from two differing simplified models for earthquake faults. The first model is a reproduction of the Block-Slider model of Carlson et al. (1991), a model often employed in seismicity studies. The second model is an elastodynamic fault model based upon the Lattice Solid Model (LSM) of Mora and Place (1994). We performed simulations in which the fault length was varied in each model and generated synthetic catalogs of event sizes and times. From these catalogs, we constructed interval event size distributions and inter-event time distributions. The larger, localised events in the Block-Slider model displayed the same scaling behaviour as events in the LSM; however, the distribution of inter-event times was markedly different. The analysis of both event size and inter-event time statistics is an effective method for comparative studies of differing simplified models for earthquake faults.
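The two statistics compared above are straightforward to construct from any event catalog. The sketch below does so for a synthetic random catalog (power-law sizes, uniform event times), standing in for the fault-model outputs; the distributions and numbers are illustrative assumptions.

```python
import random, math

# Build the two catalog statistics discussed: an interval (binned) event
# size distribution and the inter-event time distribution, from a
# synthetic catalog rather than a fault simulation.
random.seed(1)
n_events = 2000
# power-law ("Gutenberg-Richter-like") sizes via inverse-transform sampling
sizes = [random.random() ** -1.0 for _ in range(n_events)]
times = sorted(random.uniform(0.0, 1e4) for _ in range(n_events))

inter_event = [t2 - t1 for t1, t2 in zip(times, times[1:])]

# interval size distribution on logarithmic bins
n_bins = 20
log_max = max(math.log10(s) for s in sizes)
counts = [0] * n_bins
for s in sizes:
    b = min(int(math.log10(s) / log_max * n_bins), n_bins - 1)
    counts[b] += 1

print(f"{len(inter_event)} inter-event times, largest size {max(sizes):.1f}")
```

Comparing two models then amounts to comparing their `counts` histograms and their `inter_event` distributions, which is exactly the diagnostic that separated the Block-Slider model from the LSM here.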

Relevance: 90.00%

Abstract:

Background: Reliable information on causes of death is a fundamental component of health development strategies, yet globally only about one-third of countries have access to such information. For countries currently without adequate mortality reporting systems there are useful models other than resource-intensive population-wide medical certification. Sample-based mortality surveillance is one such approach. This paper provides methods for addressing appropriate sample size considerations in relation to mortality surveillance, with particular reference to situations in which prior information on mortality is lacking.

Methods: The feasibility of model-based approaches for predicting the expected mortality structure and cause composition is demonstrated for populations in which only limited empirical data are available. An algorithmic approach is then provided to derive the minimum person-years of observation needed to generate robust estimates for the rarest cause of interest in three hypothetical populations, each representing a different level of health development.

Results: Modelled life expectancies at birth and cause of death structures were within expected ranges based on published estimates for countries at comparable levels of health development. Total person-years of observation required in each population could be more than halved by limiting the set of age, sex, and cause groups regarded as 'of interest'.

Discussion: The methods proposed are consistent with the philosophy of establishing priorities across broad clusters of causes for which the public health response implications are similar. The examples provided illustrate the options available when considering the design of mortality surveillance for population health monitoring purposes.
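The core person-years calculation can be sketched under a standard Poisson assumption: a target relative standard error e for a death count requires about 1/e² expected deaths, so the person-years needed are driven by the rate of the rarest cause of interest. The rates and precision target below are assumed examples, not the paper's figures.

```python
# Minimal person-years calculation in the spirit of the algorithm described.
# Under Poisson counts, the relative standard error of d expected deaths is
# 1/sqrt(d), so a target RSE of e requires about 1/e^2 deaths.
def person_years_needed(rarest_cause_rate, rel_std_error):
    expected_deaths = 1.0 / rel_std_error ** 2
    return expected_deaths / rarest_cause_rate

# rarest cause at 1 death per 100,000 person-years, 10% relative SE
py_fine = person_years_needed(1e-5, 0.10)
# broader cause clusters raise the rate of the rarest group, cutting the burden
py_broad = person_years_needed(5e-5, 0.10)
print(f"{py_fine:.0f} vs {py_broad:.0f} person-years")
```

This makes concrete why limiting the set of age, sex, and cause groups 'of interest' can more than halve the required observation time: the binding constraint is always the rarest group retained.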

Relevance: 90.00%

Abstract:

Automatic signature verification is a well-established and active area of research with numerous applications, such as bank check verification and ATM access. This paper proposes a novel approach to the problem of automatic off-line signature verification and forgery detection. The proposed approach is based on fuzzy modeling that employs the Takagi-Sugeno (TS) model. Signature verification and forgery detection are carried out using angle features extracted with a box approach. Each feature corresponds to a fuzzy set. The features are fuzzified by an exponential membership function involved in the TS model, which is modified to include structural parameters. The structural parameters are devised to account for possible variations due to handwriting styles and moods. The membership functions constitute weights in the TS model. Optimizing the output of the TS model with respect to the structural parameters yields the solution for the parameters. We have also derived two TS models, by considering a rule for each input feature in the first formulation (multiple rules) and a single rule for all input features in the second. In this work, we have found that the TS model with multiple rules is better than the TS model with a single rule for detecting three types of forgeries (random, skilled and unskilled) from a large database of sample signatures, in addition to verifying genuine signatures. We have also devised three approaches, viz. an innovative approach and two intuitive approaches, using the TS model with multiple rules for improved performance. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
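The TS mechanics (exponential memberships acting as rule weights over linear consequents) can be sketched minimally. The paper's structural parameters and box-approach angle features are not reproduced; the rule centres, spreads and consequent coefficients below are invented for illustration.

```python
import math

# Minimal Takagi-Sugeno sketch: exponential memberships weight linear
# consequents; the output is the weighted average over the rules.
# Centres, spreads and consequents are assumed, not fitted values.
def ts_output(features, rules):
    """rules: list of (centres, spreads, (bias, coeffs)) per rule."""
    num = den = 0.0
    for centres, spreads, (a0, coeffs) in rules:
        # firing strength: product of exponential memberships per feature
        w = math.prod(math.exp(-((x - c) / s) ** 2)
                      for x, c, s in zip(features, centres, spreads))
        y = a0 + sum(a * x for a, x in zip(coeffs, features))  # linear consequent
        num += w * y
        den += w
    return num / den

rules = [
    ([30.0, 60.0], [10.0, 10.0], (0.2, [0.01, 0.00])),   # "genuine-like" rule
    ([70.0, 20.0], [10.0, 10.0], (0.9, [0.00, -0.01])),  # "forgery-like" rule
]
score = ts_output([35.0, 55.0], rules)       # two assumed angle features
print(f"verification score = {score:.3f}")
```

Here the input angles sit near the first rule's centres, so that rule dominates the weighted average; in the paper the analogous weights are shaped by the optimised structural parameters rather than fixed spreads.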

Relevance: 90.00%

Abstract:

Motivation: The clustering of gene profiles across some experimental conditions of interest contributes significantly to the elucidation of unknown gene function, the validation of gene discoveries and the interpretation of biological processes. However, this clustering problem is not straightforward, as the profiles of the genes are not all independently distributed and the expression levels may have been obtained from an experimental design involving replicated arrays. Ignoring the dependence between the gene profiles and the structure of the replicated data can result in important sources of variability in the experiments being overlooked in the analysis, with the consequent possibility of misleading inferences being made. We propose a random-effects model that provides a unified approach to the clustering of genes with correlated expression levels measured in a wide variety of experimental situations. Our model is an extension of the normal mixture model to account for the correlations between the gene profiles and to enable covariate information to be incorporated into the clustering process. Hence the model is applicable to longitudinal studies with or without replication (for example, time-course experiments, by using time as a covariate) and to cross-sectional experiments, by using categorical covariates to represent the different experimental classes.

Results: We show that our random-effects model can be fitted by maximum likelihood via the EM algorithm, for which the E (expectation) and M (maximization) steps can be implemented in closed form. Hence our model can be fitted deterministically without the need for time-consuming Monte Carlo approximations. The effectiveness of our model-based procedure for the clustering of correlated gene profiles is demonstrated on three real datasets, representing typical microarray experimental designs and covering time-course, repeated-measurement and cross-sectional data. In these examples, relevant clusters of the genes are obtained, which are supported by existing gene-function annotation. A synthetic dataset is also considered.
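The closed-form EM updates can be illustrated on the base model this work extends: a two-component univariate normal mixture fitted to synthetic data. The random-effects and covariate machinery of the full model is omitted; this is only a sketch of the E and M steps.

```python
import math, random

# Minimal EM for a two-component univariate Gaussian mixture (the base
# normal mixture model; synthetic data, illustrative starting values).
random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(150)]
        + [random.gauss(4.0, 1.0) for _ in range(150)])

def norm_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

pi1, mu, var = 0.5, [-1.0, 5.0], [1.0, 1.0]
loglik_trace = []
for _ in range(30):
    # E-step: posterior probability of component 1 for each observation
    resp, ll = [], 0.0
    for x in data:
        p1 = pi1 * norm_pdf(x, mu[0], var[0])
        p2 = (1 - pi1) * norm_pdf(x, mu[1], var[1])
        resp.append(p1 / (p1 + p2))
        ll += math.log(p1 + p2)
    loglik_trace.append(ll)
    # M-step: closed-form weighted updates of the mixture parameters
    n1 = sum(resp)
    pi1 = n1 / len(data)
    mu[0] = sum(r * x for r, x in zip(resp, data)) / n1
    mu[1] = sum((1 - r) * x for r, x in zip(resp, data)) / (len(data) - n1)
    var[0] = sum(r * (x - mu[0]) ** 2 for r, x in zip(resp, data)) / n1
    var[1] = sum((1 - r) * (x - mu[1]) ** 2 for r, x in zip(resp, data)) / (len(data) - n1)

print(f"pi1 = {pi1:.2f}, means = {mu[0]:.2f}, {mu[1]:.2f}")
```

The monotone increase of the log-likelihood across iterations is the deterministic convergence property that makes the closed-form EM fit attractive relative to Monte Carlo approximations.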

Relevance: 90.00%

Abstract:

The ‘leading coordinate’ approach to computing an approximate reaction pathway, with subsequent determination of the true minimum energy profile, is applied to a two-proton chain transfer model based on the chromophore and its surrounding moieties within the green fluorescent protein (GFP). Using an ab initio quantum chemical method, a number of different relaxed energy profiles are found for several plausible guesses at leading coordinates. The results obtained for different trial leading coordinates are rationalized through the calculation of a two-dimensional relaxed potential energy surface (PES) for the system. Analysis of the 2-D relaxed PES reveals that two of the trial pathways are entirely spurious, while two others contain useful information and can be used to furnish starting points for successful saddle-point searches. Implications for selection of trial leading coordinates in this class of proton chain transfer reactions are discussed, and a simple diagnostic function is proposed for revealing whether or not a relaxed pathway based on a trial leading coordinate is likely to furnish useful information.
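The leading-coordinate procedure can be sketched on a model surface: freeze the trial leading coordinate x at each scan value and relax the remaining coordinate y to its minimum, yielding the relaxed energy profile. The coupled double-well potential below is an assumed toy surface, not the GFP ab initio PES.

```python
# 'Leading coordinate' relaxed scan on an assumed 2-D model surface:
# for each frozen x, minimise the energy over y (crude grid relaxation).
def energy(x, y):
    return (x ** 2 - 1.0) ** 2 + 2.0 * (y - 0.5 * x) ** 2  # coupled double well

def relaxed_profile(xs, y_grid):
    profile = []
    for x in xs:
        e_min = min(energy(x, y) for y in y_grid)   # relax y at frozen x
        profile.append(e_min)
    return profile

xs = [i / 50.0 - 1.0 for i in range(101)]           # scan x from -1 to 1
y_grid = [j / 100.0 - 1.0 for j in range(201)]      # y in [-1, 1]
profile = relaxed_profile(xs, y_grid)
barrier = max(profile) - min(profile)
print(f"relaxed barrier height = {barrier:.3f}")
```

A poor choice of leading coordinate shows up exactly as in the paper: the relaxed profile either misses the true saddle region entirely or exhibits discontinuous jumps as the relaxed y hops between basins.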

Relevance: 90.00%

Abstract:

We often need to estimate the size of wild populations to determine the appropriate management action, for example, to set a harvest quota. Monitoring is usually planned under the assumption that it must be carried out at fixed intervals in time, typically annually, before the harvest quota is set. However, monitoring can be very expensive, and we should weigh the cost of monitoring against the improvement that it makes in decision making. A less costly alternative to monitoring annually is to predict the population size using a population model and information from previous surveys. In this paper, the problem of monitoring frequency is posed within a decision-theory framework. We discover that a monitoring regime that varies according to the state of the system can outperform fixed-interval monitoring. This idea is illustrated using data for a red kangaroo (Macropus rufus) population in South Australia. Whether or not one should monitor in a given year depends on the estimated population density in the previous year, the uncertainty in that population estimate, and past rainfall. We discover that monitoring is important when a model-based prediction of population density is very uncertain. This may occur if monitoring has not taken place for several years, or if rainfall has been above average. Monitoring is also important when prior information suggests that the population is near a critical threshold in population abundance. However, monitoring is less important when the optimal management action would not be altered by new information.
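A state-dependent monitoring rule of the kind described can be sketched as follows: survey only when the model-based prediction is too uncertain or the predicted density is near a critical threshold. The thresholds, the uncertainty-growth rule, and all numbers are assumptions for illustration, not the fitted kangaroo-model values.

```python
# Illustrative state-dependent monitoring rule: survey when prediction
# uncertainty is high or the predicted density is near a critical
# threshold. All thresholds and dynamics here are assumed.
def should_monitor(pred_density, pred_sd, critical_density,
                   sd_limit=0.3, buffer=0.2):
    too_uncertain = pred_sd > sd_limit
    near_threshold = abs(pred_density - critical_density) < buffer
    return too_uncertain or near_threshold

# prediction uncertainty grows each unmonitored year (assumed growth rate)
sd = 0.1
decisions = []
for year in range(6):
    monitor = should_monitor(pred_density=1.0, pred_sd=sd, critical_density=0.4)
    decisions.append(monitor)
    sd = 0.1 if monitor else sd * 1.5   # a survey resets the uncertainty

print(decisions)
```

Under this rule, surveys occur only in the years when accumulated prediction uncertainty crosses the limit, rather than on a fixed annual schedule.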

Relevance: 90.00%

Abstract:

Many developing south-east Asian governments are not capturing full rent from domestic forest logging operations. Such rent losses are commonly related to institutional failures, where informal institutions tend to dominate the control of forestry activity in spite of weakly enforced regulations. Our model is an attempt to add a new dimension to thinking about deforestation. We present a simple conceptual model, based on individual decisions rather than social or forest planning, which includes the human dynamics of participation in informal activity and the relatively slower ecological dynamics of changes in forest resources. We demonstrate how incumbent informal logging operations can be persistent, and that any spending aimed at replacing the informal institutions can only be successful if it pushes institutional settings past some threshold. (C) 2006 Elsevier B.V. All rights reserved.
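The threshold behaviour described can be sketched with a simple participation dynamic: the payoff to joining informal logging rises with the fraction already participating, so enforcement spending only collapses the informal institution once it exceeds a threshold. The functional forms and all numbers are assumptions for illustration, not the paper's model.

```python
# Minimal sketch of threshold dynamics in informal-logging participation.
# x = fraction participating; payoff to informal activity rises with x
# (conformity / institutional support); s = enforcement spending.
# Parameter values and the linear payoff form are assumed.
def final_participation(s, x0=0.6, a=0.3, b=0.4, dt=0.01, steps=20000):
    x = x0
    for _ in range(steps):
        payoff_gap = a + b * x - s        # informal payoff minus enforcement cost
        x += dt * x * (1.0 - x) * payoff_gap
        x = min(max(x, 0.0), 1.0)
    return x

low_spend = final_participation(s=0.1)    # below threshold: logging persists
high_spend = final_participation(s=0.8)   # above threshold: participation collapses
print(f"{low_spend:.3f} vs {high_spend:.3f}")
```

Spending between the two regimes merely shifts the interior equilibrium; only spending past the threshold tips the system to the no-participation state, which is the persistence result the model demonstrates.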

Relevance: 90.00%

Abstract:

Model transformations are an integral part of model-driven development. Incremental updates are a key execution scenario for transformations in model-based systems, and are especially important for the evolution of such systems. This paper presents a strategy for the incremental maintenance of declarative, rule-based transformation executions. The strategy involves recording dependencies of the transformation execution on information from source models and from the transformation definition. Changes to the source models or the transformation itself can then be directly mapped to their effects on transformation execution, allowing changes to target models to be computed efficiently. This particular approach has many benefits. It supports changes to both source models and transformation definitions, it can be applied to incomplete transformation executions, and a priori knowledge of volatility can be used to further increase the efficiency of change propagation.
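The dependency-recording idea can be sketched minimally: each rule application records which source elements it read, so a source change re-executes only the affected applications. The rule and model shapes below (a Class-to-Table mapping) are invented for illustration, not the paper's transformation language.

```python
# Minimal sketch of dependency recording for incremental transformation
# maintenance: re-run only rule applications whose recorded source
# dependencies include the changed element.
source = {"c1": {"kind": "Class", "name": "Person"},
          "c2": {"kind": "Class", "name": "Order"}}
target, deps, executions = {}, {}, []

def run_rule(src_id):
    """Assumed Class -> Table rule; records its source dependencies."""
    executions.append(src_id)
    target[src_id] = {"kind": "Table", "name": source[src_id]["name"].lower()}
    deps[src_id] = {src_id}                  # sources this target was computed from

def propagate_change(changed_id):
    """Map a source change directly to the affected rule applications."""
    for tgt_id, needed in deps.items():
        if changed_id in needed:
            run_rule(tgt_id)

for sid in source:                            # initial (full) execution
    run_rule(sid)

source["c1"]["name"] = "Customer"             # source model change
propagate_change("c1")                        # incremental update

print(target["c1"]["name"], executions)
```

Only the application that read the changed element re-executes; the other target elements are untouched, which is the efficiency gain the recorded dependencies buy.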

Relevance: 80.00%

Abstract:

We report on a quantitative study of the growth process of 87Rb Bose-Einstein condensates. By continuous evaporative cooling we directly control the thermal cloud from which the condensate grows. We compare the experimental data with the results of a theoretical model based on quantum kinetic theory. We find quantitative agreement with theory for the situation of strong cooling, whereas in the weak cooling regime a distinctly different behavior is found in the experiment.

Relevance: 80.00%

Abstract:

In humans, intra-abdominal pressure (IAP) is elevated during many everyday activities. This experiment aimed to investigate the extent to which increased IAP, without concurrent activity of the abdominal or back extensor muscles, produces an extensor torque. With subjects positioned in side lying on a swivel table with its axis at L3, moments about this vertebral level were measured while IAP was transiently increased by electrical stimulation of the diaphragm via the phrenic nerve. There was no electromyographic activity in the abdominal and back extensor muscles. When IAP was increased artificially to approximately 15% of the maximum IAP amplitude that could be generated voluntarily with the trunk positioned in flexion, a trunk extensor moment (approximately 6 Nm) was recorded. The size of the effect was proportional to the increase in pressure. The extensor moment was consistent with that predicted from a model based on measurements of abdominal cross-sectional area and the IAP moment arm. When IAP was momentarily increased while the trunk was flexed passively at a constant velocity, the external torque required to maintain the velocity increased. These results provide the first in vivo data on the amplitude of the extensor moment produced by increased IAP. Although the net effect of this extensor torque in functional tasks would depend on the muscles used to increase the IAP and their associated flexion torque, the data provide evidence that IAP contributes, at least in part, to spinal stability. (C) 2001 Elsevier Science Ltd. All rights reserved.
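The model's structure (extensor moment = pressure x abdominal cross-sectional area x moment arm) lends itself to an order-of-magnitude check. The area and moment-arm values below are assumed typical-adult figures, not the subject measurements from the study.

```python
# Order-of-magnitude check of the IAP extensor-moment mechanism:
# moment = IAP x abdominal cross-sectional area x moment arm to L3.
# Area and moment-arm values are assumed, not measured.
def extensor_moment(iap_pa, csa_m2, moment_arm_m):
    return iap_pa * csa_m2 * moment_arm_m    # N*m

moment = extensor_moment(iap_pa=4000.0, csa_m2=0.03, moment_arm_m=0.05)
double = extensor_moment(iap_pa=8000.0, csa_m2=0.03, moment_arm_m=0.05)
print(f"{moment:.1f} N*m -> {double:.1f} N*m when IAP doubles")
```

With these assumed values a 4 kPa pressure rise yields a moment of about 6 Nm, the same order as the measured effect, and the linearity in pressure matches the observed proportionality.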