65 results for Model-Based Design


Relevance:

100.00%

Publisher:

Abstract:

Motivation: The clustering of gene profiles across some experimental conditions of interest contributes significantly to the elucidation of unknown gene function, the validation of gene discoveries and the interpretation of biological processes. However, this clustering problem is not straightforward, as the profiles of the genes are not all independently distributed and the expression levels may have been obtained from an experimental design involving replicated arrays. Ignoring the dependence between the gene profiles and the structure of the replicated data can result in important sources of variability in the experiments being overlooked in the analysis, with the consequent possibility of misleading inferences being made. We propose a random-effects model that provides a unified approach to the clustering of genes with correlated expression levels measured in a wide variety of experimental situations. Our model is an extension of the normal mixture model to account for the correlations between the gene profiles and to enable covariate information to be incorporated into the clustering process. Hence the model is applicable to longitudinal studies with or without replication, for example, time-course experiments by using time as a covariate, and to cross-sectional experiments by using categorical covariates to represent the different experimental classes. Results: We show that our random-effects model can be fitted by maximum likelihood via the EM algorithm, for which the E (expectation) and M (maximization) steps can be implemented in closed form. Hence our model can be fitted deterministically without the need for time-consuming Monte Carlo approximations. The effectiveness of our model-based procedure for the clustering of correlated gene profiles is demonstrated on three real datasets, representing typical microarray experimental designs and covering time-course, repeated-measurement and cross-sectional data. In these examples, relevant clusters of the genes are obtained, which are supported by existing gene-function annotation. A synthetic dataset is also considered.
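The closed-form E and M steps referred to above follow the familiar normal-mixture pattern. The sketch below shows that pattern for a plain univariate Gaussian mixture only; it is an illustration under simplifying assumptions and omits the paper's random effects, covariates and correlation structure. The component count k, the initialisation and the function name are assumptions.

```python
# Minimal EM sketch for a univariate normal mixture (illustration only;
# not the paper's random-effects extension).
import numpy as np

def em_normal_mixture(x, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    pi = np.full(k, 1.0 / k)                 # mixing proportions
    mu = rng.choice(x, k, replace=False)     # component means (random start)
    var = np.full(k, np.var(x))              # component variances
    for _ in range(n_iter):
        # E-step: posterior probability that each profile belongs to each cluster
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of proportions, means and variances
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var, resp.argmax(axis=1)  # hard cluster labels from posteriors
```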

Relevance:

90.00%

Publisher:

Abstract:

Multiple sampling is widely used in vadose zone percolation experiments to investigate the extent to which soil structure heterogeneities influence the spatial and temporal distributions of water and solutes. In this note, a simple, robust mathematical model, based on the beta statistical distribution, is proposed as a method of quantifying the magnitude of heterogeneity in such experiments. The model relies on fitting two parameters, alpha and zeta, to the cumulative elution curves generated in multiple-sample percolation experiments, and does not require knowledge of the soil structure. A homogeneous or uniform distribution of a solute and/or soil-water is indicated by alpha = zeta = 1. Using these parameters, a heterogeneity index (HI) is defined as root 3 times the ratio of the standard deviation to the mean. Uniform or homogeneous flow of water or solutes is indicated by HI = 1, and heterogeneity is indicated by HI > 1. A large value of this index may indicate preferential flow. The heterogeneity index relies only on knowledge of the elution curves generated from multiple-sample percolation experiments and is, therefore, easily calculated. The index may also be used to describe and compare the differences in solute and soil-water percolation between different experiments. The use of this index is discussed for several different leaching experiments. (C) 1999 Elsevier Science B.V. All rights reserved.
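As a worked illustration of the index described above, HI = root 3 x (standard deviation / mean) can be evaluated directly from the fitted beta parameters; with alpha = zeta = 1 the fitted distribution is uniform and HI equals exactly 1. The helper name and the example parameter values below are assumptions for illustration only.

```python
# Heterogeneity index from fitted beta parameters: HI = sqrt(3) * SD / mean.
import math

def heterogeneity_index(alpha, zeta):
    mean = alpha / (alpha + zeta)
    var = alpha * zeta / ((alpha + zeta) ** 2 * (alpha + zeta + 1.0))
    return math.sqrt(3.0) * math.sqrt(var) / mean

print(heterogeneity_index(1.0, 1.0))   # 1.0 -> homogeneous / uniform flow
print(heterogeneity_index(0.3, 0.3))   # ~1.37 > 1 -> heterogeneous, possibly preferential flow
```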

Relevance:

90.00%

Publisher:

Abstract:

The conventional convection-dispersion (also called axial dispersion) model is widely used to interrelate hepatic availability (F) and clearance (Cl) with the morphology and physiology of the liver and to predict effects such as changes in liver blood flow on F and Cl. An extended form of the convection-dispersion model has been developed to adequately describe the outflow concentration-time profiles for vascular markers at both short and long times after bolus injections into perfused livers. The model, based on flux concentration and a convolution of catheters and large vessels, assumes that solute elimination in hepatocytes follows either fast distribution into or radial diffusion in hepatocytes. The model includes a secondary vascular compartment, postulated to be interconnecting sinusoids. Analysis of the mean hepatic transit time (MTT) and normalized variance (CV2) of solutes with extraction showed that the predictions of MTT and CV2 for the extended and conventional models are essentially identical, irrespective of the magnitude of the rate constants representing permeability, volume, and clearance parameters, provided that there is significant hepatic extraction. In conclusion, the application of the newly developed extended convection-dispersion model has shown that the unweighted conventional convection-dispersion model can be used to describe the disposition of extracted solutes and, in particular, to estimate hepatic availability and clearance in both experimental and clinical situations.
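For readers wanting the definitions behind MTT and CV2: both are moments of the outflow concentration-time profile. The sketch below is a generic moment calculation by trapezoidal integration, not the extended convection-dispersion model itself; the function names and sampling scheme are assumptions.

```python
# Generic moment analysis of an outflow concentration-time curve.
import numpy as np

def _area(y, t):
    # trapezoidal rule, written out explicitly
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

def transit_time_moments(t, c):
    """t: sampling times; c: outflow concentration at those times."""
    t = np.asarray(t, dtype=float)
    c = np.asarray(c, dtype=float)
    m0 = _area(c, t)                       # area under the outflow curve
    mtt = _area(t * c, t) / m0             # mean transit time (first normalised moment)
    var = _area((t - mtt) ** 2 * c, t) / m0
    return mtt, var / mtt ** 2             # MTT and CV^2 (normalised variance)
```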

Relevance:

90.00%

Publisher:

Abstract:

The occurrence of foliated rock masses is common in mining environments. Methods employing a continuum approximation to describe the deformation of such rock masses possess a clear advantage over methods in which each rock layer and each inter-layer interface (joint) is explicitly modelled. In devising such a continuum model it is imperative that the moment (couple) stresses and internal rotations associated with the bending of the rock layers be properly incorporated in the model formulation. Such an approach leads to a Cosserat-type theory. In the present model, the behaviour of the intact rock layer is assumed to be linearly elastic and the joints are assumed to be elastic-perfectly plastic. Conditions of slip at the interfaces are determined by a Mohr-Coulomb criterion with tension cut-off at zero normal stress. The theory is valid for large deformations. The model is incorporated into the finite element program AFENA and validated against an analytical solution of elementary buckling problems of a layered medium under gravity loading. A design chart suitable for assessing the stability of slopes in foliated rock masses against flexural buckling failure has been developed. The design chart is easy to use and provides a quick estimate of critical loading factors for slopes in foliated rock masses. It is shown that the model based on Euler's buckling theory proposed by Cavers (Rock Mechanics and Rock Engineering 1981; 14:87-104) substantially overestimates the critical heights for a vertical slope and underestimates them for sub-vertical slopes. Copyright (C) 2001 John Wiley & Sons, Ltd.

Relevance:

90.00%

Publisher:

Abstract:

Test templates and a test template framework are introduced as useful concepts in specification-based testing. The framework can be defined using any model-based specification notation and used to derive tests from model-based specifications; in this paper, it is demonstrated using the Z notation. The framework formally defines test data sets and their relation to the operations in a specification and to other test data sets, providing structure to the testing process. Flexibility is preserved, so that many testing strategies can be used. Important application areas of the framework are discussed, including refinement of test data, regression testing, and test oracles.
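A rough sketch of the test-template idea: each template characterises a subset of an operation's valid input space, and concrete test data are selected (or refined) from that subset. The integer square-root operation and the particular partition below are illustrative assumptions, not the Z-based framework of the paper.

```python
# Illustrative test templates: predicates that carve up an operation's input space.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TestTemplate:
    name: str
    predicate: Callable[[int], bool]      # characterises the valid input subset

    def select(self, candidates: List[int]) -> List[int]:
        # pick concrete test data belonging to this template
        return [x for x in candidates if self.predicate(x)]

templates = [
    TestTemplate("zero input", lambda x: x == 0),
    TestTemplate("perfect square", lambda x: x > 0 and int(x ** 0.5) ** 2 == x),
    TestTemplate("non-square positive", lambda x: x > 0 and int(x ** 0.5) ** 2 != x),
]
candidates = list(range(0, 20))
for t in templates:
    print(t.name, t.select(candidates))
```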

Relevance:

90.00%

Publisher:

Abstract:

A mixture model incorporating long-term survivors has been adopted in the field of biostatistics, where some individuals may never experience the failure event under study; the surviving fraction may be considered cured. In most applications, the survival times are assumed to be independent. However, when the survival data are obtained from a multi-centre clinical trial, it is conceivable that the environmental conditions and facilities shared within a clinic affect the proportion cured as well as the failure risk for the uncured individuals. This necessitates a long-term survivor mixture model with random effects. In this paper, the long-term survivor mixture model is extended for the analysis of multivariate failure time data using the generalized linear mixed model (GLMM) approach. The proposed model is applied to analyse a data set from a multi-centre clinical trial of carcinoma as an illustration. Some simulation experiments are performed to assess the applicability of the model based on the average biases of the estimates obtained. Copyright (C) 2001 John Wiley & Sons, Ltd.
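The mixture structure referred to above can be written as S(t) = pi + (1 - pi) S_u(t), where pi is the cured (long-term survivor) fraction and S_u is the survival function of the uncured individuals. The sketch below evaluates this with an assumed exponential S_u and assumed parameter values; the paper's GLMM random effects for clinics are not reproduced here.

```python
# Long-term survivor (cure) mixture: S(t) = pi + (1 - pi) * S_u(t).
import numpy as np

def mixture_survival(t, cured_fraction, hazard_uncured):
    # exponential survival for the uncured group is an illustrative assumption
    s_uncured = np.exp(-hazard_uncured * np.asarray(t, dtype=float))
    return cured_fraction + (1.0 - cured_fraction) * s_uncured

print(mixture_survival([0.0, 1.0, 5.0, 50.0], cured_fraction=0.3, hazard_uncured=0.5))
# survival plateaus at 0.3 (the cured fraction) as t grows
```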

Relevance:

90.00%

Publisher:

Abstract:

A dynamic modelling methodology, which combines on-line variable estimation and parameter identification with physical laws to form an adaptive model for rotary sugar drying processes, is developed in this paper. In contrast to conventional rate-based models using empirical transfer coefficients, the heat and mass transfer rates are estimated from on-line measurements in the new model. Furthermore, a set of improved sectional solid transport equations with localized parameters is developed in this work to replace the global correlation for the computation of solid retention time. Since a number of key model variables and parameters are identified on-line using measurement data, the model is able to closely track the dynamic behaviour of rotary drying processes within a broad range of operational conditions. This adaptive model is validated against experimental data obtained from a pilot-scale rotary sugar dryer. The proposed modelling methodology can be easily incorporated into nonlinear model-based control schemes to form a unified modelling and control framework.

Relevance:

90.00%

Publisher:

Abstract:

In the limit state design (LSD) method, each design criterion is formally stated and assessed using a performance function. The performance function defines the relationship between the design parameters and the design criterion. In practice, LSD involves factoring up loads and factoring down calculated strengths and material parameters. This provides a convenient way to carry out routine probability-based design. The factors are statistically calculated to produce a design with an acceptably low probability of failure. Hence the ultimate load and the design material properties are mathematical concepts that have no physical interpretation; they may be physically impossible. Similarly, the appropriate analysis model is also defined by the performance function and may not describe the real behaviour at the perceived physical equivalent limit condition. These points must be understood to avoid confusion in the discussion and application of partial factor LSD methods.
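A minimal sketch of the partial-factor mechanics described above: the load is factored up, the calculated strength is factored down, and the criterion is checked on these factored quantities, which need not correspond to any physically realisable state. The factor values and function name below are assumptions for illustration.

```python
# Partial-factor limit state check on factored (design) quantities.
def limit_state_satisfied(load, resistance, gamma_load=1.5, phi_resistance=0.8):
    """Return True if the factored design criterion is met."""
    design_load = gamma_load * load                   # factored-up action
    design_resistance = phi_resistance * resistance   # factored-down capacity
    return design_resistance >= design_load

print(limit_state_satisfied(load=100.0, resistance=200.0))   # True
print(limit_state_satisfied(load=100.0, resistance=170.0))   # False
```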

Relevance:

90.00%

Publisher:

Abstract:

We report the first steps of a collaborative project between the University of Queensland, Polyflow, Michelin, SK Chemicals, and RMIT University on the simulation, validation and application of a recently introduced constitutive model designed to describe branched polymers. Whereas much progress has been made on predicting the complex flow behaviour of many polymers, in particular linear ones, it sometimes proves difficult to predict simultaneously shear thinning and extensional strain hardening behaviour using traditional constitutive models. Recently, a new viscoelastic model based on molecular topology was proposed by McLeish and Larson (1998). We explore the predictive power of a differential multi-mode version of this pom-pom model for the flow behaviour of two commercial polymer melts: a (long-chain branched) low-density polyethylene (LDPE) and a (linear) high-density polyethylene (HDPE). The model responses are compared to the elongational recovery experiments published by Langouche and Debbaut (1999), and to start-up of simple shear flow and stress relaxation after simple and reverse step strain experiments carried out in our laboratory.

Relevance:

90.00%

Publisher:

Abstract:

Low concentrate density from wet drum magnetic separators in dense medium circuits can cause operating difficulties due to an inability to obtain the required circulating medium density and, indirectly, high medium solids losses. The literature is almost silent on the processes controlling concentrate density. However, the common name for the region through which concentrate is discharged, the squeeze pan gap, implies that some extrusion process is thought to be at work. There is no model of magnetics recovery in a wet drum magnetic separator that includes as inputs all significant machine and operating variables. A series of trials, in both factorial experiments and single-variable experiments, was carried out using a purpose-built rig featuring a small industrial scale (700 mm lip length, 900 mm diameter) wet drum magnetic separator. A substantial data set of 191 trials was generated in this work. The results of the factorial experiments were used to identify the variables having a significant effect on magnetics recovery. It is proposed, based both on the experimental observations of the present work and on observations reported in the literature, that the process controlling magnetic separator concentrate density is one of drainage. Such a process should be definable by an initial moisture, a drainage rate and a drainage time, the latter being set by the volumetric flowrate and the volume within the drainage zone. The magnetics can be characterised by an experimentally derived ultimate drainage moisture. A model based on these concepts and containing adjustable parameters was developed. This model was then fitted to a randomly chosen 80% of the data, and validated by application to the remaining 20%. The model is shown to be a good fit to data over concentrate solids contents from 40% to 80% solids and for both magnetite and ferrosilicon feeds. (C) 2003 Elsevier Science B.V. All rights reserved.
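To make the proposed drainage picture concrete: concentrate moisture can be thought of as relaxing from an initial value towards the ultimate drainage moisture over a drainage time set by the drainage-zone volume divided by the volumetric flowrate. The exponential form, the rate constant and the numbers below are assumptions for illustration, not the fitted model of the paper.

```python
# Illustrative drainage picture: moisture relaxes towards an ultimate value
# over a residence (drainage) time = zone volume / volumetric flowrate.
import math

def concentrate_moisture(m_initial, m_ultimate, drainage_rate, zone_volume, flowrate):
    drainage_time = zone_volume / flowrate   # residence time in the drainage zone
    return m_ultimate + (m_initial - m_ultimate) * math.exp(-drainage_rate * drainage_time)

# Example: a higher feed flowrate gives a shorter drainage time and a wetter concentrate.
print(concentrate_moisture(0.60, 0.20, drainage_rate=2.0, zone_volume=0.01, flowrate=0.005))
print(concentrate_moisture(0.60, 0.20, drainage_rate=2.0, zone_volume=0.01, flowrate=0.02))
```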

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a new model, based on thermodynamics and molecular interactions, to describe the vapour-liquid phase equilibria and surface tension of a pure component. The model assumes that the bulk fluid can be characterised as a set of parallel layers; because of this molecular structure, we term the model the molecular layer structure theory (MLST). Each layer has two energetic components. One is the interaction energy of one molecule of that layer with all surrounding layers. The other is the intra-layer Helmholtz free energy, which accounts for the internal energy and the entropy of that layer. The equilibrium between two separating phases is derived from the minimum of the grand potential, and the surface tension is calculated as the excess Helmholtz energy of the system. We test this model with a number of components (argon, krypton, ethane, n-butane, iso-butane, ethylene and sulphur hexafluoride) and the results are very satisfactory. (C) 2002 Elsevier Science B.V. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

In a previous paper, Hoornaert et al. (Powder Technol. 96 (1998) 116-128) presented data from granulation experiments performed in a 50 L Lodige high shear mixer. In this study, the same data were simulated with a population balance model. Based on an analysis of the experimental data, the granulation process was divided into three separate stages: nucleation, induction, and coalescence growth. These three stages were then simulated separately, with promising results. It is possible to derive a kernel that fits both the induction and the coalescence growth stages. Modeling the nucleation stage proved to be more challenging due to the complex mechanism of nucleus formation. From this work, some recommendations are made for the improvement of this type of model.
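For orientation, a discretised population balance with coalescence typically takes the Smoluchowski form: granules of sizes i and j merge at a rate set by a kernel beta(i, j), producing births into the combined size class and deaths from the colliding classes. The sketch below uses a size-independent kernel and an explicit Euler step; the kernel value, bin scheme and step size are assumptions, not the kernel fitted to the Lodige data.

```python
# One explicit Euler step of a discretised coalescence population balance
# with a size-independent (constant) kernel beta0.
import numpy as np

def coalescence_step(n, beta0, dt):
    """n[k]: number of granules in size class k+1 (in primary-particle units)."""
    n = np.asarray(n, dtype=float)
    k_max = len(n)
    birth = np.zeros(k_max)
    for i in range(k_max):
        for j in range(k_max - 1 - i):
            # a granule of size (i+1) + (j+1) lands in bin index i + j + 1
            birth[i + j + 1] += 0.5 * beta0 * n[i] * n[j]
    death = beta0 * n * n.sum()   # loss by coalescing with any other granule
    return n + dt * (birth - death)

n = coalescence_step([1000.0, 0.0, 0.0, 0.0, 0.0], beta0=1e-4, dt=1.0)
print(n)   # mass shifts from the smallest class into larger classes
```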

Relevance:

90.00%

Publisher:

Abstract:

Vermicompost filtration is a new on-site waste treatment system; consequently, little is known about the filter medium properties. The aim of this preliminary study was to quantify physical and compositional properties of vermicompost filter beds that had been used to treat domestic solid organic waste and wastewater. This paper presents trials performed on pilot-scale reactors filled with vermicompost from a full-scale vermicompost filtration system. Household solid organic waste and raw wastewater at a rate of 130 L/m²/d were applied to the reactor bed surface over a four-month period. It was found that fresh casts laid on the bed surface had a BOD of 1290 mg/g VS, while casts buried to a depth of 10 cm had a BOD of 605 mg/g VS. Below this depth there was little further biodegradation of earthworm casts, despite cast ages of up to five years. Solid material in the reactor accounted for only 7-10% of the reactor volume. The total voidage comprised large free-draining pores, which accounted for 15-20% of the reactor volume, and micropores able to hold water against gravity, which accounted for 60-70%. It was shown that water could flow through both the micropores and the macropores of the medium following a wastewater application. The wastewater flow characteristics were modeled by a two-region model based on the Richards equation, which is used to describe flow in porous, spatially heterogeneous materials.

Relevance:

90.00%

Publisher:

Abstract:

alpha-Conotoxins, from cone snails, and alpha-neurotoxins, from snakes, are competitive inhibitors of nicotinic acetylcholine receptors (nAChRs) that have overlapping binding sites in the ACh binding pocket. These disulphide-rich peptides are used extensively as tools to localize and pharmacologically characterize specific nAChR subtypes. Recently, a homology model based on the high-resolution structure of an ACh binding protein (AChBP) allowed the three-fingered alpha-neurotoxins to be docked onto the alpha7 nAChR. To investigate whether alpha-conotoxins interact with the nAChR in a similar manner, we built homology models of the human alpha7 and alpha3beta2 nAChRs and performed docking simulations of the alpha-conotoxins ImI, PnIB, PnIA and MII using the program GOLD. Docking revealed that alpha-conotoxins have a different mode of interaction from alpha-neurotoxins, with surprisingly few nAChR residues in common between their overlapping binding sites. These docking experiments show that ImI and PnIB bind to the ACh binding pocket via a small cavity located above the beta9/beta10 hairpin of the (+)alpha7 nAChR subunit. Interestingly, PnIB, PnIA and MII were found to bind in a similar location on alpha7 or alpha3beta2 receptors, mostly through hydrophobic interactions, while ImI bound further from the ACh binding pocket, mostly through electrostatic interactions. These findings, which distinguish alpha-conotoxin and alpha-neurotoxin binding modes, have implications for the rational design of selective nAChR antagonists. Copyright (C) 2004 John Wiley & Sons, Ltd.

Relevance:

90.00%

Publisher:

Abstract:

We examine the event statistics obtained from two differing simplified models of earthquake faults. The first model is a reproduction of the Block-Slider model of Carlson et al. (1991), a model often employed in seismicity studies. The second is an elastodynamic fault model based upon the Lattice Solid Model (LSM) of Mora and Place (1994). We performed simulations in which the fault length was varied in each model and generated synthetic catalogs of event sizes and times. From these catalogs, we constructed interval event-size distributions and inter-event time distributions. The larger, localised events in the Block-Slider model displayed the same scaling behaviour as events in the LSM; however, the distribution of inter-event times was markedly different. The analysis of both event-size and inter-event time statistics is an effective method for comparative studies of differing simplified models of earthquake faults.
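As a sketch of the catalog analysis described above: given event times and sizes from a synthetic catalog, the interval event-size distribution and the inter-event time distribution are simple histograms of the sizes and of the waiting times between successive events. The binning choices and function name below are assumptions.

```python
# Interval event-size and inter-event time distributions from a synthetic catalog.
import numpy as np

def catalog_statistics(times, sizes, n_bins=20):
    times = np.sort(np.asarray(times, dtype=float))
    sizes = np.asarray(sizes, dtype=float)   # assumed strictly positive event sizes
    # interval (non-cumulative) event-size distribution on logarithmic bins
    size_bins = np.logspace(np.log10(sizes.min()), np.log10(sizes.max()), n_bins + 1)
    size_counts, _ = np.histogram(sizes, bins=size_bins)
    # inter-event (waiting) time distribution
    waiting = np.diff(times)
    wait_counts, wait_bins = np.histogram(waiting, bins=n_bins)
    return size_bins, size_counts, wait_bins, wait_counts
```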