973 results for statistical speaker models


Relevance:

20.00%

Publisher:

Abstract:

We use published and new trace element data to identify element ratios which discriminate between arc magmas from the supra-subduction zone mantle wedge and those formed by direct melting of subducted crust (i.e. adakites). The clearest distinction is obtained with those element ratios which are strongly fractionated during refertilisation of the depleted mantle wedge, ultimately reflecting slab dehydration. Hence, adakites have significantly lower Pb/Nd and B/Be but higher Nb/Ta than typical arc magmas and continental crust as a whole. Although Li and Be are also overenriched in continental crust, the behaviour of Li/Yb and Be/Nd is more complex and these ratios do not provide unique signatures of slab melting. Archaean tonalite-trondhjemite-granodiorites (TTGs) strongly resemble ordinary mantle wedge-derived arc magmas in terms of fluid-mobile trace element content, implying that they did not form by slab melting but that they originated from mantle which was hydrated and enriched in elements lost from slabs during prograde dehydration. We suggest that Archaean TTGs formed by extensive fractional crystallisation from a mafic precursor. It is widely claimed that the time between the creation and subduction of oceanic lithosphere was significantly shorter in the Archaean (i.e. 20 Ma) than it is today. This difference was seen as an attractive explanation for the presumed preponderance of adakitic magmas during the first half of Earth's history. However, when we consider the effects of a higher potential mantle temperature on the thickness of oceanic crust, it follows that the mean age of oceanic lithosphere has remained virtually constant. Formation of adakites has therefore always depended on local plate geometry and not on potential mantle temperature.
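To make the ratio-based discrimination above concrete, the short sketch below computes Pb/Nd, B/Be and Nb/Ta for two hypothetical whole-rock analyses and screens them against illustrative cut-off values; both the concentrations and the thresholds are invented for demonstration and are not taken from the study.

```python
# Illustrative sketch only: concentrations (ppm) and thresholds are hypothetical,
# chosen to demonstrate the ratio-screening idea, not values from the study.
samples = {
    "arc_basalt":   {"Pb": 5.0, "Nd": 15.0, "B": 12.0, "Be": 0.6, "Nb": 2.5, "Ta": 0.18},
    "adakite_like": {"Pb": 3.0, "Nd": 30.0, "B": 3.0,  "Be": 1.2, "Nb": 7.0, "Ta": 0.35},
}

# Hypothetical discrimination thresholds: adakites have lower Pb/Nd and B/Be,
# but higher Nb/Ta, than typical mantle wedge-derived arc magmas.
THRESHOLDS = {"Pb/Nd": 0.25, "B/Be": 8.0, "Nb/Ta": 17.0}

def ratios(c):
    return {"Pb/Nd": c["Pb"] / c["Nd"],
            "B/Be":  c["B"] / c["Be"],
            "Nb/Ta": c["Nb"] / c["Ta"]}

for name, conc in samples.items():
    r = ratios(conc)
    adakite_like = (r["Pb/Nd"] < THRESHOLDS["Pb/Nd"]
                    and r["B/Be"] < THRESHOLDS["B/Be"]
                    and r["Nb/Ta"] > THRESHOLDS["Nb/Ta"])
    print(name, {k: round(v, 2) for k, v in r.items()},
          "-> adakite-like" if adakite_like else "-> wedge-derived arc signature")
```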

Relevance:

20.00%

Publisher:

Abstract:

Objectives: To compare the population modelling programs NONMEM and P-PHARM during investigation of the pharmacokinetics of tacrolimus in paediatric liver-transplant recipients. Methods: Population pharmacokinetic analysis was performed using NONMEM and P-PHARM on retrospective data from 35 paediatric liver-transplant patients receiving tacrolimus therapy. The same data were presented to both programs. Maximum likelihood estimates were sought for apparent clearance (CL/F) and apparent volume of distribution (V/F). Covariates screened for influence on these parameters were weight, age, gender, post-operative day, days of tacrolimus therapy, transplant type, biliary reconstructive procedure, liver function tests, creatinine clearance, haematocrit, corticosteroid dose, and potential interacting drugs. Results: A satisfactory model was developed in both programs with a single categorical covariate - transplant type - providing stable parameter estimates and small, normally distributed (weighted) residuals. In NONMEM, the continuous covariates - age and liver function tests - improved modelling further. Mean parameter estimates were CL/F (whole liver) = 16.3 l/h, CL/F (cut-down liver) = 8.5 l/h and V/F = 565 l in NONMEM, and CL/F = 8.3 l/h and V/F = 155 l in P-PHARM. Individual Bayesian parameter estimates were CL/F (whole liver) = 17.9 +/- 8.8 l/h, CL/F (cut-down liver) = 11.6 +/- 18.8 l/h and V/F = 712 +/- 792 l in NONMEM, and CL/F (whole liver) = 12.8 +/- 3.5 l/h, CL/F (cut-down liver) = 8.2 +/- 3.4 l/h and V/F = 221 +/- 164 l in P-PHARM. Marked interindividual kinetic variability (38-108%) and residual random error (approximately 3 ng/ml) were observed. P-PHARM was more user friendly and readily provided informative graphical presentation of results. NONMEM allowed a wider choice of errors for statistical modelling and coped better with complex covariate data sets. Conclusion: Results from parametric modelling programs can vary due to different algorithms employed to estimate parameters, alternative methods of covariate analysis and variations and limitations in the software itself.
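As a rough illustration of the pharmacokinetic quantities reported above, the following sketch plugs the NONMEM mean estimates (CL/F = 16.3 l/h for a whole liver, V/F = 565 l) into a simple one-compartment oral model; the absorption rate constant and the dose are hypothetical values chosen only to produce a concentration-time profile, and the model is not the NONMEM or P-PHARM model itself.

```python
import math

# Minimal sketch of a one-compartment oral model using the reported population
# estimates CL/F = 16.3 L/h (whole liver) and V/F = 565 L. The absorption rate
# constant ka and the single oral dose are hypothetical illustration values,
# not parameters from the study.
CL_F = 16.3    # apparent clearance, L/h
V_F = 565.0    # apparent volume of distribution, L
ka = 4.5       # 1/h, hypothetical first-order absorption rate constant
dose_mg = 3.0  # hypothetical single oral dose, mg

ke = CL_F / V_F  # elimination rate constant, 1/h

def conc(t_h):
    """Concentration (mg/L) at t hours after a single oral dose given at t = 0."""
    return (dose_mg * ka / (V_F * (ka - ke))) * (math.exp(-ke * t_h) - math.exp(-ka * t_h))

for t in (1, 4, 12, 24):
    print(f"t = {t:>2} h : {conc(t) * 1000:.1f} ng/mL")  # convert mg/L to ng/mL
```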

Relevance:

20.00%

Publisher:

Abstract:

An important feature of improving lattice gas models and classical isotherms is the incorporation of a pore size dependent capacity, which has hitherto been overlooked. In this paper, we develop a model for predicting the temperature dependent variation in capacity with pore size. The model is based on the analysis of a lattice gas model using a density functional theory approach at the close packed limit. Fluid-fluid and solid-fluid interactions are modeled by the Lennard-Jones 12-6 potential and Steele's 10-4-3 potential, respectively. The capacity of methane in a slit-shaped carbon pore is calculated from the characteristic parameters of the unit cell, which are extracted by minimizing the grand potential of the unit cell. The capacities predicted by the proposed model are in good agreement with those obtained from grand canonical Monte Carlo simulation, for pores that can accommodate up to three adsorbed layers. Single particle and pair distributions exhibit characteristic features that correspond to the sequence of buckling and rhombic transitions that occur as the slit pore width is increased. The model provides a useful tool to model continuous variation in the microstructure of an adsorbed phase, namely buckling and rhombic transitions, with increasing pore width. (C) 2002 American Institute of Physics.
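The sketch below evaluates the two interaction potentials named in the abstract (the Lennard-Jones 12-6 fluid-fluid potential and Steele's 10-4-3 solid-fluid potential) for methane in a slit-shaped carbon pore. The parameter values are commonly quoted literature figures used here as assumptions, not the inputs of the paper, and the sketch does not reproduce the density functional calculation itself.

```python
import math

# Hedged sketch of the two interaction potentials named in the abstract, for
# methane in a slit-shaped carbon pore. Energies are expressed in kelvin
# (energy divided by Boltzmann's constant); distances are in nanometres.
sigma_ff, eps_ff = 0.381, 148.1   # methane Lennard-Jones parameters (nm, K)
sigma_ss, eps_ss = 0.340, 28.0    # graphite carbon parameters (nm, K)
rho_s, delta = 114.0, 0.335       # carbon density (nm^-3), graphite layer spacing (nm)

# Lorentz-Berthelot combining rules for the solid-fluid pair
sigma_sf = 0.5 * (sigma_ff + sigma_ss)
eps_sf = math.sqrt(eps_ff * eps_ss)

def lj_12_6(r):
    """Fluid-fluid Lennard-Jones 12-6 potential (K) at separation r (nm)."""
    x = sigma_ff / r
    return 4.0 * eps_ff * (x**12 - x**6)

def steele_10_4_3(z):
    """Solid-fluid Steele 10-4-3 potential (K) at distance z (nm) from one wall."""
    pref = 2.0 * math.pi * rho_s * eps_sf * sigma_sf**2 * delta
    return pref * (0.4 * (sigma_sf / z)**10 - (sigma_sf / z)**4
                   - sigma_sf**4 / (3.0 * delta * (z + 0.61 * delta)**3))

def slit_pore_potential(z, width):
    """External potential felt by a fluid particle in a slit pore of width `width` (nm)."""
    return steele_10_4_3(z) + steele_10_4_3(width - z)

print("LJ well depth    :", round(lj_12_6(2 ** (1 / 6) * sigma_ff), 1), "K")
print("Slit-pore minimum:", round(min(slit_pore_potential(z / 1000, 1.2)
                                      for z in range(300, 901)), 1), "K (1.2 nm pore)")
```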

Relevance:

20.00%

Publisher:

Abstract:

Input-driven models provide an explicit and readily testable account of language learning. Although we share Ellis's view that the statistical structure of the linguistic environment is a crucial and, until recently, relatively neglected variable in language learning, we also recognize that the approach makes three assumptions about cognition and language learning that are not universally shared. The three assumptions concern (a) the language learner as an intuitive statistician, (b) the constraints on what constitute relevant surface cues, and (c) the redescription problem faced by any system that seeks to derive abstract grammatical relations from the frequency of co-occurring surface forms and functions. These are significant assumptions that must be established if input-driven models are to gain wider acceptance. We comment on these issues and briefly describe a distributed, instance-based approach that retains the key features of the input-driven account advocated by Ellis but that also addresses shortcomings of the current approaches.
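As a toy illustration of the "intuitive statistician" assumption discussed above, the sketch below estimates transitional probabilities between adjacent word forms purely from co-occurrence counts in a tiny invented corpus; it is a minimal demonstration of frequency-based learning over surface forms, not the distributed, instance-based model the authors advocate.

```python
from collections import Counter, defaultdict

# Toy illustration of learning from co-occurrence frequencies alone.
# The corpus is invented and far too small for anything but demonstration.
corpus = [
    "the dog chased the cat",
    "the cat chased the dog",
    "a dog saw a cat",
]

bigrams = Counter()
for sentence in corpus:
    words = sentence.split()
    bigrams.update(zip(words, words[1:]))

# Total count of each word in first (context) position of a bigram
context_total = Counter()
for (w1, _w2), n in bigrams.items():
    context_total[w1] += n

# Transitional probabilities P(w2 | w1) estimated from raw frequencies
transitional = defaultdict(dict)
for (w1, w2), n in bigrams.items():
    transitional[w1][w2] = n / context_total[w1]

print(transitional["the"])   # distribution over words that follow "the"
```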

Relevance:

20.00%

Publisher:

Abstract:

Evaluation of the performance of the APACHE III (Acute Physiology and Chronic Health Evaluation) ICU (intensive care unit) and hospital mortality models at the Princess Alexandra Hospital, Brisbane, is reported. Prospective collection of demographic, diagnostic, physiological, laboratory, admission and discharge data of 5681 consecutive eligible admissions (1 January 1995 to 1 January 2000) was conducted at the Princess Alexandra Hospital, a metropolitan Australian tertiary referral medical/surgical adult ICU. ROC (receiver operating characteristic) curve areas for the APACHE III ICU mortality and hospital mortality models demonstrated excellent discrimination. Observed ICU mortality (9.1%) was significantly overestimated by the APACHE III model adjusted for hospital characteristics (10.1%), but did not significantly differ from the prediction of the generic APACHE III model (8.6%). In contrast, observed hospital mortality (14.8%) agreed well with the prediction of the APACHE III model adjusted for hospital characteristics (14.6%), but was significantly underestimated by the unadjusted APACHE III model (13.2%). Calibration curves and goodness-of-fit analysis using Hosmer-Lemeshow statistics demonstrated that calibration was good for the unadjusted APACHE III ICU mortality model and for the APACHE III hospital mortality model adjusted for hospital characteristics. Post hoc analysis revealed a declining annual SMR (standardized mortality ratio) during the study period. This trend was present in each of the non-surgical, emergency and elective surgical diagnostic groups, and the change was temporally related to increased specialist staffing levels. This study demonstrates that the APACHE III model performs well on independent assessment in an Australian hospital. Changes observed in annual SMR using such a validated model support a hypothesis of improved survival outcomes over 1995-1999.
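The sketch below shows, on simulated data, the two calibration summaries referred to above: a standardized mortality ratio (observed deaths divided by predicted deaths) and a simple Hosmer-Lemeshow style comparison of observed and expected deaths across risk deciles. The predicted probabilities and outcomes are synthetic, not APACHE III data.

```python
import random

# Synthetic sketch of the SMR and a Hosmer-Lemeshow style decile table.
# Predicted death risks and outcomes are simulated for illustration only.
random.seed(0)
pred = [random.betavariate(2, 10) for _ in range(5000)]   # predicted death risk
obs = [1 if random.random() < p else 0 for p in pred]     # simulated outcome (1 = died)

smr = sum(obs) / sum(pred)
print(f"SMR (observed / predicted deaths): {smr:.2f}")

# Hosmer-Lemeshow style table: observed vs expected deaths per risk decile
ranked = sorted(zip(pred, obs))
n_groups = 10
size = len(ranked) // n_groups
for g in range(n_groups):
    chunk = ranked[g * size:(g + 1) * size]
    expected = sum(p for p, _ in chunk)
    observed = sum(o for _, o in chunk)
    print(f"decile {g + 1:2d}: expected {expected:6.1f}  observed {observed:4d}")
```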

Relevance:

20.00%

Publisher:

Abstract:

It has been argued that power-law time-to-failure fits for cumulative Benioff strain and an evolution in size-frequency statistics in the lead-up to large earthquakes are evidence that the crust behaves as a Critical Point (CP) system. If so, intermediate-term earthquake prediction is possible. However, this hypothesis has not been proven. If the crust does behave as a CP system, stress correlation lengths should grow in the lead-up to large events through the action of small to moderate ruptures and drop sharply once a large event occurs. However, this evolution in stress correlation lengths cannot be observed directly. Here we show, using the lattice solid model to describe discontinuous elasto-dynamic systems subjected to shear and compression, that it is possible for correlation lengths to exhibit CP-type evolution. In the case of a granular system subjected to shear, this evolution occurs in the lead-up to the largest event and is accompanied by an increasing rate of moderate-sized events and power-law acceleration of Benioff strain release. In the case of an intact sample subjected to compression, the evolution occurs only after a mature fracture system has developed. The results support the existence of a physical mechanism for intermediate-term earthquake forecasting and suggest this mechanism is fault-system dependent. This offers an explanation of why accelerating Benioff strain release is not observed prior to all large earthquakes. The results prove the existence of an underlying evolution in discontinuous elasto-dynamic systems which is capable of providing a basis for forecasting catastrophic failure and earthquakes.
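The power-law time-to-failure relation for cumulative Benioff strain mentioned above is commonly written as eps(t) = A + B(t_f - t)^m. The hedged sketch below generates a synthetic strain curve from that relation and re-fits it, assuming numpy and scipy are available; it illustrates the functional form only and is unrelated to the lattice solid simulations in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of the power-law time-to-failure relation for cumulative Benioff
# strain, eps(t) = A + B * (t_f - t)**m. The "observed" curve is synthetic:
# generated from the relation plus noise, then re-fitted, purely to show the
# functional form and how t_f and m are recovered.
rng = np.random.default_rng(1)

A_true, B_true, tf_true, m_true = 1.0, -0.6, 10.0, 0.3
t = np.sort(rng.uniform(0.0, 9.8, 200))                 # times of observation
eps = A_true + B_true * (tf_true - t) ** m_true
eps += rng.normal(scale=0.01, size=t.size)              # observational noise

def time_to_failure(ti, A, B, tf, m):
    dt = np.clip(tf - ti, 1e-6, None)                   # keep the base positive
    return A + B * dt ** m

params, _ = curve_fit(time_to_failure, t, eps, p0=(0.8, -0.4, 10.5, 0.5), maxfev=20000)
A, B, tf, m = params
print(f"recovered t_f = {tf:.2f} (true 10.0), exponent m = {m:.2f} (true 0.3)")
```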

Relevance:

20.00%

Publisher:

Abstract:

The Load-Unload Response Ratio (LURR) method is an intermediate-term earthquake prediction approach that has shown considerable promise. It involves calculating the ratio of a specified energy release measure during loading and unloading where loading and unloading periods are determined from the earth tide induced perturbations in the Coulomb Failure Stress on optimally oriented faults. In the lead-up to large earthquakes, high LURR values are frequently observed a few months or years prior to the event. These signals may have a similar origin to the observed accelerating seismic moment release (AMR) prior to many large earthquakes or may be due to critical sensitivity of the crust when a large earthquake is imminent. As a first step towards studying the underlying physical mechanism for the LURR observations, numerical studies are conducted using the particle based lattice solid model (LSM) to determine whether LURR observations can be reproduced. The model is initialized as a heterogeneous 2-D block made up of random-sized particles bonded by elastic-brittle links. The system is subjected to uniaxial compression from rigid driving plates on the upper and lower edges of the model. Experiments are conducted using both strain and stress control to load the plates. A sinusoidal stress perturbation is added to the gradual compressional loading to simulate loading and unloading cycles and LURR is calculated. The results reproduce signals similar to those observed in earthquake prediction practice with a high LURR value followed by a sudden drop prior to macroscopic failure of the sample. The results suggest that LURR provides a good predictor for catastrophic failure in elastic-brittle systems and motivate further research to study the underlying physical mechanisms and statistical properties of high LURR values. The results provide encouragement for earthquake prediction research and the use of advanced simulation models to probe the physics of earthquakes.
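A minimal sketch of the LURR statistic described above: an energy-release measure is summed separately over loading and unloading phases, here defined by the sign of the derivative of a sinusoidal stress perturbation, and the ratio of the two sums is taken. The event series and the exponent are illustrative choices, not the study's data or its lattice solid implementation.

```python
import math
import random

# Minimal sketch of the LURR statistic: energy release summed over loading
# periods divided by energy release summed over unloading periods. The event
# catalogue below is synthetic and generated for illustration only.
random.seed(2)
period = 24.0                                    # hours, stand-in for a tidal cycle

def loading(t):
    """True while the sinusoidal stress perturbation is increasing."""
    return math.cos(2.0 * math.pi * t / period) > 0.0

events = [(random.uniform(0.0, 30 * period),     # event time
           10 ** random.uniform(0.0, 3.0))       # event "energy" (arbitrary units)
          for _ in range(500)]

m = 0.5                                          # Benioff-strain-like exponent on energy
e_load = sum(e ** m for t, e in events if loading(t))
e_unload = sum(e ** m for t, e in events if not loading(t))
print(f"LURR = {e_load / e_unload:.2f}  (values well above 1 are the precursory signal)")
```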

Relevance:

20.00%

Publisher:

Abstract:

We introduce a conceptual model for the in-plane physics of an earthquake fault. The model employs cellular automaton techniques to simulate tectonic loading, earthquake rupture, and strain redistribution. The impact of a hypothetical crustal elastodynamic Green's function is approximated by a long-range strain redistribution law with an r^(-p) dependence. We investigate the influence of the effective elastodynamic interaction range upon the dynamical behaviour of the model by conducting experiments with different values of the exponent (p). The results indicate that this model has two distinct, stable modes of behaviour. The first mode produces a characteristic earthquake distribution with moderate to large events preceded by an interval of time in which the rate of energy release accelerates. A correlation function analysis reveals that accelerating sequences are associated with a systematic, global evolution of strain energy correlations within the system. The second stable mode produces Gutenberg-Richter statistics, with near-linear energy release and no significant global correlation evolution. A model with effectively short-range interactions preferentially displays Gutenberg-Richter behaviour. However, models with long-range interactions appear to switch between the characteristic and GR modes. As the range of elastodynamic interactions is increased, characteristic behaviour begins to dominate GR behaviour. These models demonstrate that evolution of strain energy correlations may occur within systems with a fixed elastodynamic interaction range. Supposing that similar mode-switching dynamical behaviour occurs within earthquake faults, intermediate-term forecasting of large earthquakes may be feasible for some earthquakes but not for others, in alignment with certain empirical seismological observations. Further numerical investigation of dynamical models of this type may lead to advances in earthquake forecasting research and theoretical seismology.
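To illustrate the long-range redistribution rule described above, the sketch below runs a toy one-dimensional cellular automaton in which a failing cell sheds most of its strain to all other cells with weight proportional to r^(-p). The rules, dimensions and parameter values are illustrative stand-ins, not the authors' model.

```python
import numpy as np

# Toy 1-D cellular automaton with long-range strain redistribution weighted by
# r**(-p). Loading, threshold and dissipation rules are illustrative stand-ins.
rng = np.random.default_rng(3)
N, p, threshold = 200, 2.0, 1.0
strain = rng.uniform(0.0, 0.5, N)

def redistribute(strain, i, p):
    r = np.abs(np.arange(N) - i).astype(float)
    r[i] = np.inf                                # the failed cell keeps nothing
    w = r ** (-p)
    strain += strain[i] * 0.9 * w / w.sum()      # 10% of the strain is "radiated" away
    strain[i] = 0.0

event_sizes = []
for step in range(5000):
    strain += 0.001                              # uniform tectonic loading
    failed = np.flatnonzero(strain >= threshold)
    size = 0
    while failed.size:                           # cascade until no cell exceeds threshold
        for i in failed:
            redistribute(strain, i, p)
            size += 1
        failed = np.flatnonzero(strain >= threshold)
    if size:
        event_sizes.append(size)

print(f"{len(event_sizes)} events, largest involved {max(event_sizes)} cell failures")
```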

Relevance:

20.00%

Publisher:

Abstract:

This trial compared the cost of an integrated home-based care model with traditional inpatient care for acute chronic obstructive pulmonary disease (COPD). Twenty-five patients with acute COPD were randomised to either home or hospital management following a request for hospital admission. Costs per separation in the acute-care-at-home group ($745, 95% CI $595-$895, n = 13) were significantly lower (p < 0.01) than in the hospital group ($2543, 95% CI $1766-$3321, n = 12). There was an improvement in lung function in the hospital-managed group at the Outpatient Department review, decreased anxiety in the Emergency Department in the home-managed group, and equal patient satisfaction with care delivery. Acute-care-at-home schemes can substitute for usual hospital care for some patients without adverse effects, and potentially release resources. A funding model that allows adequate resource delivery to the community will be needed if acute care is to be devolved to community providers.
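The sketch below shows how summary statistics of the kind reported above (a mean cost per separation with a 95% confidence interval for each arm, and a comparison of the two means) can be computed; the cost vectors are synthetic stand-ins, not the trial data.

```python
import math
import statistics

# Synthetic per-separation costs for the two arms (n = 13 home, n = 12 hospital).
# These values are invented to illustrate the calculations, not the trial data.
home = [520, 610, 700, 745, 790, 830, 880, 640, 760, 810, 700, 690, 735]
hospital = [1800, 2100, 2400, 2550, 2700, 2900, 3100, 2300, 2600, 2750, 2450, 2850]

def mean_ci(x, z=1.96):
    """Mean and normal-approximation 95% confidence interval."""
    m = statistics.mean(x)
    half = z * statistics.stdev(x) / math.sqrt(len(x))
    return m, m - half, m + half

for name, costs in (("home", home), ("hospital", hospital)):
    m, lo, hi = mean_ci(costs)
    print(f"{name:8s}: ${m:7.0f} per separation (95% CI ${lo:.0f}-${hi:.0f})")

# Welch-type z statistic for the difference in mean cost between arms
m1, m2 = statistics.mean(home), statistics.mean(hospital)
se = math.sqrt(statistics.variance(home) / len(home)
               + statistics.variance(hospital) / len(hospital))
print(f"difference = ${m2 - m1:.0f}, z = {(m2 - m1) / se:.1f}")
```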

Relevance:

20.00%

Publisher:

Abstract:

We compare Bayesian methodology utilizing the freeware package BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues in using Gibbs sampling, as implemented by BUGS, to fit subject-specific Bayesian generalized linear models, in which the components of variation may be estimated directly, are discussed. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect the correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%), even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and who presented symptoms associated with osteoarthritis occurring in joints of the hand.
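As a hedged illustration of the simulation design described above, the sketch below draws dichotomous twin data from an ACE liability-threshold model, in which monozygotic pairs share liability covariance a^2 + c^2 and dizygotic pairs a^2/2 + c^2. The variance shares, threshold and summary statistic are illustrative choices, not the paper's specification or its BUGS/Mx code.

```python
import numpy as np

# ACE liability-threshold simulation: additive genetic (A), common environment
# (C) and unique environment (E) components. Variance shares and the threshold
# below are illustrative, not the values used in the study.
rng = np.random.default_rng(4)
a2, c2 = 0.4, 0.2                    # additive genetic and common environment shares
e2 = 1.0 - a2 - c2                   # unique environment share
n_pairs, threshold = 2000, 0.5       # 2000 pairs, dichotomising the latent liability

def simulate(cov):
    """Draw bivariate-normal liabilities for twin pairs with the given covariance."""
    sigma = np.array([[1.0, cov], [cov, 1.0]])
    liab = rng.multivariate_normal([0.0, 0.0], sigma, size=n_pairs)
    return (liab > threshold).astype(int)        # dichotomous phenotype

mz = simulate(a2 + c2)               # monozygotic pairs share all of A and C
dz = simulate(0.5 * a2 + c2)         # dizygotic pairs share half of A, all of C

def concordance(pairs):
    """Pairwise concordance: both affected / at least one affected."""
    both = np.sum(pairs.sum(axis=1) == 2)
    any_affected = np.sum(pairs.sum(axis=1) >= 1)
    return both / any_affected

print(f"MZ pairwise concordance = {concordance(mz):.2f}")
print(f"DZ pairwise concordance = {concordance(dz):.2f}")
```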

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the robustness of a range of short-term interest rate models. We examine the robustness of these models over different data sets, time periods, sampling frequencies, and estimation techniques. We examine a range of popular one-factor models that allow the conditional mean (drift) and conditional variance (diffusion) to be functions of the current short rate. We find that parameter estimates are highly sensitive to all of these factors in the eight countries that we examine. Since parameter estimates are not robust, these models should be used with caution in practice.
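The one-factor models referred to above let both the drift and the diffusion depend on the current short rate; a common nesting is the CKLS form dr = (alpha + beta r) dt + sigma r^gamma dW. The sketch below simulates that form with a simple Euler scheme, using illustrative parameter values rather than estimates from the paper.

```python
import numpy as np

# Euler simulation of the CKLS family dr = (alpha + beta*r) dt + sigma * r**gamma dW.
# Parameter values are illustrative only; gamma = 0.5 gives a CIR-type model.
rng = np.random.default_rng(5)
alpha, beta, sigma, gamma = 0.02, -0.25, 0.06, 0.5
dt, n_steps = 1.0 / 252.0, 252 * 5                    # daily steps over five years
r = np.empty(n_steps + 1)
r[0] = 0.05                                           # 5% initial short rate

for t in range(n_steps):
    drift = (alpha + beta * r[t]) * dt
    diffusion = sigma * max(r[t], 0.0) ** gamma * np.sqrt(dt) * rng.standard_normal()
    r[t + 1] = max(r[t] + drift + diffusion, 0.0)     # crude floor to keep the rate positive

print(f"mean simulated short rate: {r.mean():.3%}, long-run drift target: {-alpha / beta:.3%}")
```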

Relevance:

20.00%

Publisher:

Abstract:

The Smart State initiative requires both improved education and training, particularly in technical fields, and entrepreneurship to commercialise new ideas. In this study, we propose an entrepreneurial intentions model as a guide to examine the educational choices and entrepreneurial intentions of first-year university students, focusing on the effect of role models. A survey of over 1000 first-year university students revealed that the most enterprising students were choosing to study in the disciplines of information technology and business, economics and law, or selecting dual degree programs that include business. The role models most often identified for their choice of field of study were parents, followed by teachers and peers, with females identifying more role models than males. For entrepreneurship, students' role models were parents and peers, followed by famous persons and teachers. Males and females identified similar numbers of role models, but males found starting a business more desirable and more feasible, and reported higher entrepreneurial intention. The implications of these findings for Smart State policy are discussed.