993 results for LABORATORY MODELS
Abstract:
Supersymmetric t-J Gaudin models with open boundary conditions are investigated by means of the algebraic Bethe ansatz method. Off-shell Bethe ansatz equations of the boundary Gaudin systems are derived and used to construct and solve the KZ equations associated with the sl(2|1)^(1) superalgebra.
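For orientation, the KZ (Knizhnik-Zamolodchikov) equations referred to here take, in their generic rational Gaudin form, the standard textbook shape below (this is not the specific boundary sl(2|1)^(1) construction of the paper):

    % Generic rational KZ equations for a correlation function \Psi(z_1,\dots,z_N);
    % \Omega_{ij} is the invariant (Casimir) element acting on sites i and j,
    % and \kappa is a level-dependent constant.
    \kappa\,\frac{\partial \Psi}{\partial z_i}
        = \sum_{j \neq i} \frac{\Omega_{ij}}{z_i - z_j}\,\Psi ,
        \qquad i = 1, \dots, N .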
Abstract:
As inorganic arsenic is a proven human carcinogen, significant effort has been made in recent decades to understand arsenic carcinogenesis using animal models, including rodents (rats and mice) and larger mammals such as beagles and monkeys. Transgenic animals were also used to test the carcinogenic effect of arsenicals, but until recently all models had failed to mimic satisfactorily the actual mechanism of arsenic carcinogenicity. However, within the past decade successful animal models have been developed using the most common strains of mice or rats. Thus dimethylarsinic acid (DMA), an organic arsenic compound which is the major metabolite of inorganic arsenicals in mammals, has been proven to be tumorigenic in such animals. Reports of successful cancer induction in animals by inorganic arsenic (arsenite and arsenate) have been rare, and most carcinogenicity studies have used organic arsenicals such as DMA combined with other tumor initiators. Although such experiments used high concentrations of arsenicals for the promotion of tumors, animal models using doses of arsenical species close to the exposure levels of humans in endemic areas are obviously the most significant. Almost all researchers have used drinking water or food as the pathway for the development of animal model test systems in order to mimic chronic arsenic poisoning in humans; such pathways seem more likely to achieve desirable results. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
Abstract:
Developments in computer and three-dimensional (3D) digitiser technologies have made it possible to keep track of the broad range of data required to simulate an insect moving around or over the highly heterogeneous habitat of a plant's surface. Properties of plant parts vary within a complex canopy architecture, and insect damage can induce further changes that affect an animal's movements, development and likelihood of survival. Models of plant architectural development based on Lindenmayer systems (L-systems) serve as dynamic platforms for simulation of insect movement, providing an explicit model of the developing 3D structure of a plant as well as allowing physiological processes associated with plant growth and responses to damage to be described and simulated. Simple examples of the use of the L-system formalism to model insect movement, operating at different spatial scales, from insects foraging on an individual plant to insects flying around plants in a field, are presented. Such models can be used to explore questions about the consequences of changes in environmental architecture and configuration on host finding, exploitation and its population consequences. In effect, this model is a 'virtual ecosystem' laboratory for addressing local as well as landscape-level questions pertinent to plant-insect interactions, taking plant architecture into account. (C) 2002 Elsevier Science B.V. All rights reserved.
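As a minimal, generic illustration of the L-system formalism on which such plant models are built (a sketch only; the symbols and rules below are hypothetical and not taken from the paper), a deterministic L-system rewrites every symbol of a string in parallel at each generation:

    # Minimal deterministic (D0L) L-system rewriter: each generation, every symbol
    # is replaced in parallel by its production; symbols without a rule are copied.
    def rewrite(axiom: str, rules: dict[str, str], generations: int) -> str:
        state = axiom
        for _ in range(generations):
            state = "".join(rules.get(symbol, symbol) for symbol in state)
        return state

    # Hypothetical branching example: F = grow a segment, [ and ] = push/pop a branch
    # when the string is later interpreted geometrically.
    rules = {"X": "F[+X][-X]FX", "F": "FF"}
    print(rewrite("X", rules, 3))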
Abstract:
Models of plant architecture allow us to explore how genotype-environment interactions affect the development of plant phenotypes. Such models generate masses of data organised in complex hierarchies. This paper presents a generic system for creating and automatically populating a relational database from data generated by the widely used L-system approach to modelling plant morphogenesis. Techniques from compiler technology are applied to generate attributes (new fields) in the database, to simplify query development for the recursively structured branching relationship. Use of biological terminology in an interactive query builder contributes towards making the system biologist-friendly. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
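A minimal sketch of how hierarchically branched L-system output could be held in a relational database with a self-referencing parent link, so the branching relationship can be queried recursively (the schema, table and column names here are hypothetical, not those of the system described in the paper):

    import sqlite3

    # Hypothetical schema: one row per plant module, with a self-referencing
    # parent_id encoding the branching hierarchy produced by the L-system.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE module (
        id        INTEGER PRIMARY KEY,
        parent_id INTEGER REFERENCES module(id),
        organ     TEXT,      -- e.g. 'internode', 'leaf'
        length_mm REAL
    );
    """)
    conn.executemany(
        "INSERT INTO module (id, parent_id, organ, length_mm) VALUES (?, ?, ?, ?)",
        [(1, None, "internode", 12.0), (2, 1, "leaf", 35.0), (3, 1, "internode", 9.5)],
    )

    # Recursive query over the branching relationship: all modules borne by module 1.
    rows = conn.execute("""
    WITH RECURSIVE descendants(id) AS (
        SELECT id FROM module WHERE parent_id = 1
        UNION ALL
        SELECT m.id FROM module m JOIN descendants d ON m.parent_id = d.id
    )
    SELECT id FROM descendants;
    """).fetchall()
    print(rows)  # [(2,), (3,)]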
Abstract:
Here we consider the role of abstract models in advancing our understanding of movement pathology. Models of movement coordination and control provide the frameworks necessary for the design and interpretation of studies of acquired and developmental disorders. These models do not, however, provide the resolution necessary to reveal the nature of the functional impairments that characterise specific movement pathologies. In addition, they do not provide a mapping between the structural bases of various pathologies and the associated disorders of movement. Current and prospective approaches to the study and treatment of movement disorders are discussed. It is argued that the appreciation of structure-function relationships, to which these approaches give rise, represents a challenge to current models of interlimb coordination, and a stimulus for their continued development. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
We use published and new trace element data to identify element ratios which discriminate between arc magmas from the supra-subduction zone mantle wedge and those formed by direct melting of subducted crust (i.e. adakites). The clearest distinction is obtained with those element ratios which are strongly fractionated during refertilisation of the depleted mantle wedge, ultimately reflecting slab dehydration. Hence, adakites have significantly lower Pb/Nd and B/Be but higher Nb/Ta than typical arc magmas and continental crust as a whole. Although Li and Be are also overenriched in continental crust, the behaviour of Li/Yb and Be/Nd is more complex and these ratios do not provide unique signatures of slab melting. Archaean tonalite-trondhjemite-granodiorites (TTGs) strongly resemble ordinary mantle wedge-derived arc magmas in terms of fluid-mobile trace element content, implying that they did not form by slab melting but that they originated from mantle which was hydrated and enriched in elements lost from slabs during prograde dehydration. We suggest that Archaean TTGs formed by extensive fractional crystallisation from a mafic precursor. It is widely claimed that the time between the creation and subduction of oceanic lithosphere was significantly shorter in the Archaean (i.e. 20 Ma) than it is today. This difference was seen as an attractive explanation for the presumed preponderance of adakitic magmas during the first half of Earth's history. However, when we consider the effects of a higher potential mantle temperature on the thickness of oceanic crust, it follows that the mean age of oceanic lithosphere has remained virtually constant. Formation of adakites has therefore always depended on local plate geometry and not on potential mantle temperature.
Abstract:
This study compared an enzyme-linked immunosorbent assay (ELISA) to a liquid chromatography-tandem mass spectrometry (LC/MS/MS) technique for measurement of tacrolimus concentrations in adult kidney and liver transplant recipients, and investigated how assay choice influenced pharmacokinetic parameter estimates and drug dosage decisions. Tacrolimus concentrations measured by both ELISA and LC/MS/MS from 29 kidney (n = 98 samples) and 27 liver (n = 97 samples) transplant recipients were used to evaluate the performance of these methods in the clinical setting. Tacrolimus concentrations measured by the two techniques were compared via regression analysis. Population pharmacokinetic models were developed independently using ELISA and LC/MS/MS data from 76 kidney recipients. Derived kinetic parameters were used to formulate typical dosing regimens for concentration targeting. Dosage recommendations for the two assays were compared. The relation between LC/MS/MS and ELISA measurements was best described by the regression equation ELISA = 1.02 × (LC/MS/MS) + 0.14 in kidney recipients, and ELISA = 1.12 × (LC/MS/MS) - 0.87 in liver recipients. ELISA displayed less accuracy than LC/MS/MS at lower tacrolimus concentrations. Population pharmacokinetic models based on ELISA and LC/MS/MS data were similar, with residual random errors of 4.1 ng/mL and 3.7 ng/mL, respectively. Assay choice gave rise to dosage prediction differences ranging from 0% to 30%. ELISA measurements of tacrolimus are not automatically interchangeable with LC/MS/MS values. Assay differences were greatest in adult liver recipients, probably reflecting periods of liver dysfunction and impaired biliary secretion of metabolites. While the majority of data collected in this study suggested assay differences in adult kidney recipients were minimal, findings of ELISA dosage underpredictions of up to 25% in the long term must be investigated further.
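Applying the regression equations quoted above, a short sketch converts an LC/MS/MS tacrolimus measurement into its expected ELISA reading for each recipient group (coefficients as reported in the abstract; the function name and example values are illustrative only):

    # Convert an LC/MS/MS tacrolimus concentration (ng/mL) to the expected ELISA
    # reading, using the regression coefficients reported for each recipient group.
    def expected_elisa(lcmsms_ng_per_ml: float, group: str) -> float:
        coefficients = {
            "kidney": (1.02, 0.14),   # ELISA = 1.02 * LC/MS/MS + 0.14
            "liver":  (1.12, -0.87),  # ELISA = 1.12 * LC/MS/MS - 0.87
        }
        slope, intercept = coefficients[group]
        return slope * lcmsms_ng_per_ml + intercept

    # Illustrative concentrations only, not study data.
    for value in (5.0, 10.0, 15.0):
        print(value,
              round(expected_elisa(value, "kidney"), 2),
              round(expected_elisa(value, "liver"), 2))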
Abstract:
The thin-layer drying behaviour of bananas in a heat pump dehumidifier dryer was examined. Four pre-treatments (blanching, chilling, freezing, and combined blanching and freezing) were applied to the bananas, which were dried at 50 °C with an air velocity of 3.1 m s⁻¹ and with the relative humidity of the inlet air at 10-35%. Three drying models (the simple model, the two-term exponential model and the Page model) were examined. All models were evaluated using three statistical measures: correlation coefficient, root mean square error, and mean absolute percent error. Moisture diffusivity was calculated based on the diffusion equation for an infinite cylindrical shape using the slope method. The rate of drying was higher for the pre-treatments involving freezing. The sample which was blanched only did not show any improvement in drying rate; in fact, a longer drying time resulted due to water absorption during blanching. There was no change in the rate for the chilled sample compared with the control. While all models closely fitted the drying data, the simple model showed the greatest deviation from the experimental results. The two-term exponential model was found to be the best model for describing the drying curves of bananas because its parameters better represent the physical characteristics of the drying process. Moisture diffusivities of bananas were in the range 4.3-13.2 × 10⁻¹⁰ m² s⁻¹. (C) 2002 Published by Elsevier Science Ltd.
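As a sketch of the kind of thin-layer model fitting described, assuming the standard forms MR = exp(-kt) for the simple model and MR = exp(-k t^n) for the Page model (the data below are invented for illustration and are not the study's measurements):

    import numpy as np
    from scipy.optimize import curve_fit

    # Assumed standard thin-layer drying model forms: moisture ratio MR versus time t (h).
    def simple_model(t, k):
        return np.exp(-k * t)

    def page_model(t, k, n):
        return np.exp(-k * t**n)

    # Invented, illustrative drying data (time in hours, moisture ratio 0..1).
    t = np.array([1.0, 2.0, 3.0, 5.0, 8.0, 12.0])
    mr = np.array([0.80, 0.63, 0.50, 0.32, 0.16, 0.07])

    for model, p0 in ((simple_model, [0.2]), (page_model, [0.2, 1.0])):
        params, _ = curve_fit(model, t, mr, p0=p0)
        rmse = np.sqrt(np.mean((model(t, *params) - mr) ** 2))
        print(model.__name__, np.round(params, 3), "RMSE:", round(rmse, 4))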
Abstract:
We report the first steps of a collaborative project between the University of Queensland, Polyflow, Michelin, SK Chemicals, and RMIT University on the simulation, validation and application of a recently introduced constitutive model designed to describe branched polymers. Whereas much progress has been made on predicting the complex flow behaviour of many - in particular linear - polymers, it sometimes appears difficult to predict simultaneously shear thinning and extensional strain hardening behaviour using traditional constitutive models. Recently, a new viscoelastic model based on molecular topology was proposed by McLeish and Larson (1998). We explore the predictive power of a differential multi-mode version of the pom-pom model for the flow behaviour of two commercial polymer melts: a (long-chain branched) low-density polyethylene (LDPE) and a (linear) high-density polyethylene (HDPE). The model responses are compared to elongational recovery experiments published by Langouche and Debbaut (1999), and to start-up of simple shear flow and stress relaxation after simple and reverse step strain experiments carried out in our laboratory.
Abstract:
It has been argued that power-law time-to-failure fits for cumulative Benioff strain and an evolution in size-frequency statistics in the lead-up to large earthquakes are evidence that the crust behaves as a Critical Point (CP) system. If so, intermediate-term earthquake prediction is possible. However, this hypothesis has not been proven. If the crust does behave as a CP system, stress correlation lengths should grow in the lead-up to large events through the action of small to moderate ruptures, and drop sharply once a large event occurs. However, this evolution in stress correlation lengths cannot be observed directly. Here we show, using the lattice solid model to describe discontinuous elasto-dynamic systems subjected to shear and compression, that it is possible for correlation lengths to exhibit CP-type evolution. In the case of a granular system subjected to shear, this evolution occurs in the lead-up to the largest event and is accompanied by an increasing rate of moderate-sized events and power-law acceleration of Benioff strain release. In the case of an intact sample system subjected to compression, the evolution occurs only after a mature fracture system has developed. The results support the existence of a physical mechanism for intermediate-term earthquake forecasting and suggest this mechanism is fault-system dependent. This offers an explanation of why accelerating Benioff strain release is not observed prior to all large earthquakes. The results prove the existence of an underlying evolution in discontinuous elasto-dynamic systems which is capable of providing a basis for forecasting catastrophic failure and earthquakes.
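For reference, the power-law time-to-failure fit to cumulative Benioff strain mentioned here is conventionally written as follows (standard form from the accelerating moment release literature, not a result derived in this abstract):

    % Cumulative Benioff strain up to time t, summed over the N(t) events of energy E_i,
    % and its power-law time-to-failure fit with failure time t_f and exponent m < 1.
    \varepsilon(t) = \sum_{i=1}^{N(t)} \sqrt{E_i},
    \qquad
    \varepsilon(t) \approx A + B\,(t_f - t)^{m}.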
Abstract:
We introduce a conceptual model for the in-plane physics of an earthquake fault. The model employs cellular automaton techniques to simulate tectonic loading, earthquake rupture, and strain redistribution. The impact of a hypothetical crustal elastodynamic Green's function is approximated by a long-range strain redistribution law with an r^(-p) dependence. We investigate the influence of the effective elastodynamic interaction range upon the dynamical behaviour of the model by conducting experiments with different values of the exponent (p). The results indicate that this model has two distinct, stable modes of behaviour. The first mode produces a characteristic earthquake distribution with moderate to large events preceded by an interval of time in which the rate of energy release accelerates. A correlation function analysis reveals that accelerating sequences are associated with a systematic, global evolution of strain energy correlations within the system. The second stable mode produces Gutenberg-Richter statistics, with near-linear energy release and no significant global correlation evolution. A model with effectively short-range interactions preferentially displays Gutenberg-Richter behaviour. However, models with long-range interactions appear to switch between the characteristic and GR modes. As the range of elastodynamic interactions is increased, characteristic behaviour begins to dominate GR behaviour. These models demonstrate that evolution of strain energy correlations may occur within systems with a fixed elastodynamic interaction range. If similar mode-switching dynamical behaviour occurs within earthquake faults, then intermediate-term forecasting of large earthquakes may be feasible for some earthquakes but not for others, in alignment with certain empirical seismological observations. Further numerical investigation of dynamical models of this type may lead to advances in earthquake forecasting research and theoretical seismology.
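A minimal sketch of a long-range strain redistribution rule with r^(-p) weighting, on a hypothetical 1-D lattice for illustration only (not the authors' cellular automaton):

    import numpy as np

    # Redistribute a strain drop from a failed cell to all other cells with weights
    # proportional to r**(-p), where r is the distance between cells (1-D illustration).
    def redistribute(strain: np.ndarray, failed: int, drop: float, p: float) -> np.ndarray:
        r = np.abs(np.arange(strain.size) - failed).astype(float)
        weights = np.zeros_like(r)
        mask = r > 0
        weights[mask] = r[mask] ** (-p)   # long-range r**(-p) kernel
        weights /= weights.sum()
        updated = strain.copy()
        updated[failed] -= drop           # the failed cell sheds strain...
        updated += drop * weights         # ...which is spread over the other cells
        return updated

    strain = np.full(20, 1.0)
    print(np.round(redistribute(strain, failed=10, drop=0.5, p=1.5), 3))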
Abstract:
The particle-based Lattice Solid Model (LSM) was developed to provide a basis to study the physics of rocks and the nonlinear dynamics of earthquakes (MORA and PLACE, 1994; PLACE and MORA, 1999). A new modular and flexible LSM approach has been developed that allows different microphysics to be easily included in or removed from the model. The approach provides a virtual laboratory where numerical experiments can easily be set up and all measurable quantities visualised. The proposed approach provides a means to simulate complex phenomena such as fracturing or localisation processes, and enables the effect of different micro-physics on macroscopic behaviour to be studied. The initial 2-D model is extended to allow three-dimensional simulations to be performed and particles of different sizes to be specified. Numerical bi-axial compression experiments under different confining pressures are used to calibrate the model. By tuning the different microscopic parameters (such as coefficient of friction, microscopic strength and distribution of grain sizes), the macroscopic strength of the material can be adjusted to be in agreement with laboratory experiments, and the orientation of fractures is consistent with the theoretical value predicted from the Mohr-Coulomb diagram. Simulations indicate that 3-D numerical models have different macroscopic properties than 2-D models and, hence, the model must be recalibrated for 3-D simulations. These numerical experiments illustrate that the new approach is capable of simulating typical rock fracture behaviour. The new model provides a basis to investigate nucleation, rupture and slip pulse propagation in complex fault zones without the previous model limitations of a regular low-level surface geometry and a restriction to two dimensions.
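For reference, the standard Mohr-Coulomb relations behind "the theoretical value predicted from the Mohr-Coulomb diagram" are the failure criterion and the fracture orientation it predicts:

    % Mohr--Coulomb failure criterion: cohesion c, normal stress \sigma_n, friction angle \varphi,
    % with fracture planes inclined at \theta to the axis of maximum principal compressive stress \sigma_1.
    \tau = c + \sigma_n \tan\varphi ,
    \qquad
    \theta = 45^{\circ} - \frac{\varphi}{2} .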
Abstract:
This trial compared the cost of an integrated home-based care model with traditional inpatient care for acute chronic obstructive pulmonary disease (COPD). Twenty-five patients with acute COPD were randomised to either home or hospital management following a request for hospital admission. The acute care at home group's costs per separation ($745, 95% CI $595-$895, n = 13) were significantly lower (p < 0.01) than those of the hospital group ($2543, 95% CI $1766-$3321, n = 12). There was an improvement in lung function in the hospital-managed group at the Outpatient Department review, decreased anxiety in the Emergency Department in the home-managed group, and equal patient satisfaction with care delivery. Acute care at home schemes can substitute for usual hospital care for some patients without adverse effects, and can potentially release resources. A funding model that allows adequate resource delivery to the community will be needed if there is a move to devolve acute care to community providers.
Abstract:
We compare Bayesian methodology utilizing the freeware package BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling, as implemented by BUGS, to fit subject-specific Bayesian generalized linear models in which the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%), even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
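For context, the ACE twin model referred to here decomposes phenotypic variance into additive genetic (A), common environment (C) and unique environment (E) components; the standard expected variance and twin covariances are given below (textbook form, not the specific BUGS or Mx parameterisation used in the paper):

    % ACE decomposition: path coefficients a, c, e for additive genetic, common
    % environmental and unique environmental effects on the phenotype P.
    \operatorname{Var}(P) = a^{2} + c^{2} + e^{2},
    \qquad
    \operatorname{Cov}_{\mathrm{MZ}} = a^{2} + c^{2},
    \qquad
    \operatorname{Cov}_{\mathrm{DZ}} = \tfrac{1}{2}a^{2} + c^{2}.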