971 results for relative utility models
Abstract:
The interaction of pragmatic and syntactic constraints constitutes a central issue in models of sentence processing. It is well established that object relatives (1) are harder to process than subject relatives (2). Passivization, other things being equal, increases sentence complexity. However, one of the functions of the passive construction is to promote an NP into the role of subject so that it can be more easily bound to the head NP in a higher clause. Thus, (3) is predicted to be marginally preferred over (1). Passivization in this instance may be seen as a way of avoiding the object relative construction.
1. The pipe that the traveller smoked annoyed the passengers.
2. The traveller that smoked the pipe annoyed the passengers.
3. The pipe that was smoked by the traveller annoyed the passengers.
4. The traveller that the pipe was smoked by annoyed the passengers.
5. The traveller that the lady was assaulted by annoyed the passengers.
In (4) we have relativization of an NP which has been demoted by passivization to the status of a by-phrase. Such relative clauses may only be obtained under quite restrictive pragmatic conditions. Many languages do not permit relativization of a constituent as low as a by-phrase on the NP accessibility hierarchy (Comrie, 1984). The factors which determine the acceptability of demoted NP relatives like (4-5) reflect the ease with which the NP promoted to subject position can be taken as a discourse topic. We explored the acceptability of sentences such as (1-5) using pair-wise judgements of same/different meaning, accompanied by ratings of ease of understanding. Results are discussed with reference to Gibson's DLT model of linguistic complexity and sentence processing (Gibson, 2000).
Abstract:
Thomas & Tow's evaluation of the utility of human security is an important contribution to an ongoing debate about what security is and for whom security should be provided. In particular, the authors' engagement with the human security agenda is important given the centrality of this approach to recent attempts to rethink security. This article argues, however, that Thomas & Tow's approach to the human security agenda is problematic for two central reasons. First, their attempt to narrow security to make this approach amenable to state policymakers risks reifying the sources of insecurity for individuals everywhere. Second, the conception of human security they put forward appears largely inconsistent with the normative concerns inherent in the human security agenda.
Abstract:
We use published and new trace element data to identify element ratios which discriminate between arc magmas from the supra-subduction zone mantle wedge and those formed by direct melting of subducted crust (i.e. adakites). The clearest distinction is obtained with those element ratios which are strongly fractionated during refertilisation of the depleted mantle wedge, ultimately reflecting slab dehydration. Hence, adakites have significantly lower Pb/Nd and B/Be but higher Nb/Ta than typical arc magmas and continental crust as a whole. Although Li and Be are also overenriched in continental crust, the behaviour of Li/Yb and Be/Nd is more complex and these ratios do not provide unique signatures of slab melting. Archaean tonalite-trondhjemite-granodiorites (TTGs) strongly resemble ordinary mantle wedge-derived arc magmas in terms of fluid-mobile trace element content, implying that they did not form by slab melting but that they originated from mantle which was hydrated and enriched in elements lost from slabs during prograde dehydration. We suggest that Archaean TTGs formed by extensive fractional crystallisation from a mafic precursor. It is widely claimed that the time between the creation and subduction of oceanic lithosphere was significantly shorter in the Archaean (i.e. 20 Ma) than it is today. This difference was seen as an attractive explanation for the presumed preponderance of adakitic magmas during the first half of Earth's history. However, when we consider the effects of a higher potential mantle temperature on the thickness of oceanic crust, it follows that the mean age of oceanic lithosphere has remained virtually constant. Formation of adakites has therefore always depended on local plate geometry and not on potential mantle temperature.
Abstract:
Objective: To assess the (i) benefits, (ii) harms and (iii) costs of continuing mammographic screening for women 70 years and over. Data sources and synthesis: (i) We conducted a MEDLINE search (1966 - July 2000) for decision-analytic models estimating life-expectancy gains from screening in older women. The five studies meeting the inclusion criteria were critically appraised using standard criteria. We estimated relative benefit from each model's estimate of effectiveness of screening in older women relative to that in women aged 50-69 years using the same model. (ii) With data from BreastScreen Queensland, we constructed balance sheets of the consequences of screening for women in 10-year age groups (40-49 to 80-89 years), and (iii) we used a validated model to estimate the marginal cost-effectiveness of extending screening to women 70 years and over. Results: (i) For women aged 70-79 years, the relative benefit was estimated as 40%-72%, and 18%-62% with adjustment for the impact of screening on quality of life. For women over 80 years the relative benefit was about a third of that in women aged 50-69 years, and only 14% with quality-of-life adjustment. (ii) Of 10 000 Australian women participating in ongoing screening, about 400 are recalled for further testing, and, depending on age, about 70-112 undergo biopsy and about 19-80 cancers are detected. (iii) Cost-effectiveness estimates for extending the upper age limit for mammographic screening from 69 to 79 years range from $8119 to $27 751 per quality-adjusted life-year saved, which compares favourably with extending screening to women aged 40-49 years (estimated at between $24 000 and $65 000 per life-year saved). Conclusions: Women 70 years and over, in consultation with their healthcare providers, may want to decide for themselves whether to continue mammographic screening. Decision-support materials are needed for women in this age group.
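The marginal cost-effectiveness figures quoted above follow from a simple ratio. A minimal sketch of that arithmetic, with purely illustrative numbers rather than the BreastScreen Queensland data:

```python
# Marginal cost-effectiveness of extending screening: additional
# program cost divided by additional quality-adjusted life-years.
# The figures below are hypothetical, for illustration only.

def cost_per_qaly(extra_cost: float, extra_qalys: float) -> float:
    return extra_cost / extra_qalys

# e.g. an extra $1.5M program cost for 100 QALYs gained:
print(cost_per_qaly(1_500_000, 100))  # -> 15000.0 dollars per QALY
```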
Abstract:
The rheological behaviour of nine unprocessed Australian honeys was investigated for the applicability of the Williams-Landel-Ferry (WLF) model. The viscosity of the honeys was obtained over a range of shear rates (0.01-40 s^-1) from 2 °C to 40 °C, and all the honeys exhibited Newtonian behaviour, with viscosity decreasing as the temperature increased. The honeys with high moisture content had lower viscosity. The glass transition temperatures of the honeys, as measured with a differential scanning calorimeter (DSC), ranged from -40 °C to -46 °C, and four models (WLF, Arrhenius, Vogel-Tammann-Fulcher (VTF), and power-law) were investigated to describe the temperature dependence of the viscosity. The WLF model was the most suitable: the correlation coefficient averaged 0.999 ± 0.0013, against 0.996 ± 0.0042 for the Arrhenius model, while the mean relative deviation modulus was 0-12% for the WLF model and 10-40% for the Arrhenius one. With the universal values for the WLF constants, the temperature dependence of the viscosity was poorly predicted. From non-linear regression analysis, the constants of the WLF model for the honeys were obtained (C1 = 13.7-21.1; C2 = 55.9-118.7); these differ from the universal values. These WLF constants will be valuable for adequate modelling of the rheology of the honeys, and they can be used to assess the temperature sensitivity of the honeys.
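For reference, the WLF relation being fitted has the standard textbook form (not reproduced from the paper), with T_g the glass transition temperature and eta_g the viscosity at T_g:

```latex
\log_{10}\!\left(\frac{\eta(T)}{\eta_g}\right) = \frac{-C_1\,(T - T_g)}{C_2 + (T - T_g)}
```

The "universal" constants referred to are the commonly quoted C1 = 17.44 and C2 = 51.6 K; the abstract's fitted ranges (C1 = 13.7-21.1, C2 = 55.9-118.7) depart from these, hence the poor predictions obtained with the universal values.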
Abstract:
Figures on the relative frequency of synthetic and composite future forms in Ouest-France are presented and compared with those of earlier studies on the passé simple and passé composé. The synthetic future is found to be dominant. Possible formal explanations for the distribution prove inconclusive. Distribution across different text-types is more promising, since contrastive functions of the two forms can be identified in texts where they co-occur. The composite future typically reports new proposals or plans as current news, while the synthetic future outlines details that will be realised at the time of implementation. Both functions are important in dailies, but current news is more often expressed in the present tense at the expense of the composite future.
Abstract:
Evaluation of the performance of the APACHE III (Acute Physiology and Chronic Health Evaluation) ICU (intensive care unit) and hospital mortality models at the Princess Alexandra Hospital, Brisbane is reported. Prospective collection of demographic, diagnostic, physiological, laboratory, admission and discharge data of 5681 consecutive eligible admissions (1 January 1995 to 1 January 2000) was conducted at the Princess Alexandra Hospital, a metropolitan Australian tertiary referral medical/surgical adult ICU. ROC (receiver operating characteristic) curve areas for the APACHE III ICU mortality and hospital mortality models demonstrated excellent discrimination. Observed ICU mortality (9.1%) was significantly overestimated by the APACHE III model adjusted for hospital characteristics (10.1%), but did not significantly differ from the prediction of the generic APACHE III model (8.6%). In contrast, observed hospital mortality (14.8%) agreed well with the prediction of the APACHE III model adjusted for hospital characteristics (14.6%), but was significantly underestimated by the unadjusted APACHE III model (13.2%). Calibration curves and goodness-of-fit analysis using Hosmer-Lemeshow statistics demonstrated that calibration was good for the unadjusted APACHE III ICU mortality model and for the APACHE III hospital mortality model adjusted for hospital characteristics. Post hoc analysis revealed a declining annual SMR (standardized mortality ratio) during the study period. This trend was present in each of the non-surgical, emergency and elective surgical diagnostic groups, and the change was temporally related to increased specialist staffing levels. This study demonstrates that the APACHE III model performs well on independent assessment in an Australian hospital. Changes observed in annual SMR using such a validated model support a hypothesis of improved survival outcomes over 1995-1999.
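A minimal sketch of the SMR computation underlying such analyses (assuming per-admission predicted death probabilities from the model; this is the standard observed-over-expected definition, not code from the study):

```python
import numpy as np

def smr(deaths_observed: int, predicted_probs: np.ndarray) -> float:
    """Standardized mortality ratio: observed deaths divided by
    expected deaths (the sum of model-predicted probabilities).
    SMR < 1 means fewer deaths than the model predicts."""
    return deaths_observed / predicted_probs.sum()

# Hypothetical year of admissions: mean predicted risk ~10%,
# 90 observed deaths among 1000 admissions -> SMR ~ 0.9.
rng = np.random.default_rng(42)
risks = rng.beta(2.0, 18.0, size=1000)  # illustrative risk profile
print(round(smr(90, risks), 2))
```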
Abstract:
It has been argued that power-law time-to-failure fits for cumulative Benioff strain and an evolution in size-frequency statistics in the lead-up to large earthquakes are evidence that the crust behaves as a Critical Point (CP) system. If so, intermediate-term earthquake prediction is possible. However, this hypothesis has not been proven. If the crust does behave as a CP system, stress correlation lengths should grow in the lead-up to large events through the action of small to moderate ruptures, and drop sharply once a large event occurs. However, this evolution in stress correlation lengths cannot be observed directly. Here we show, using the lattice solid model to describe discontinuous elasto-dynamic systems subjected to shear and compression, that it is possible for correlation lengths to exhibit CP-type evolution. In the case of a granular system subjected to shear, this evolution occurs in the lead-up to the largest event and is accompanied by an increasing rate of moderate-sized events and power-law acceleration of Benioff strain release. In the case of an intact sample system subjected to compression, the evolution occurs only after a mature fracture system has developed. The results support the existence of a physical mechanism for intermediate-term earthquake forecasting and suggest this mechanism is fault-system dependent. This offers an explanation of why accelerating Benioff strain release is not observed prior to all large earthquakes. The results prove the existence of an underlying evolution in discontinuous elasto-dynamic systems which is capable of providing a basis for forecasting catastrophic failure and earthquakes.
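The power-law time-to-failure fit referred to here is conventionally written (a standard form from the accelerating moment release literature, not reproduced from this paper) as

```latex
\varepsilon(t) = A + B\,(t_f - t)^{m}, \qquad \varepsilon(t) = \sum_{t_i \le t} \sqrt{E_i}
```

where the cumulative Benioff strain ε(t) is the running sum of the square roots of event energies E_i, t_f is the failure time, B < 0, and the exponent m is typically found to be around 0.3.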
Abstract:
We introduce a conceptual model for the in-plane physics of an earthquake fault. The model employs cellular automaton techniques to simulate tectonic loading, earthquake rupture, and strain redistribution. The impact of a hypothetical crustal elastodynamic Green's function is approximated by a long-range strain redistribution law with an r^-p dependence. We investigate the influence of the effective elastodynamic interaction range upon the dynamical behaviour of the model by conducting experiments with different values of the exponent p. The results indicate that this model has two distinct, stable modes of behaviour. The first mode produces a characteristic earthquake distribution with moderate to large events preceded by an interval of time in which the rate of energy release accelerates. A correlation function analysis reveals that accelerating sequences are associated with a systematic, global evolution of strain energy correlations within the system. The second stable mode produces Gutenberg-Richter statistics, with near-linear energy release and no significant global correlation evolution. A model with effectively short-range interactions preferentially displays Gutenberg-Richter behaviour. However, models with long-range interactions appear to switch between the characteristic and GR modes. As the range of elastodynamic interactions is increased, characteristic behaviour begins to dominate GR behaviour. These models demonstrate that evolution of strain energy correlations may occur within systems with a fixed elastodynamic interaction range. Supposing that similar mode-switching dynamical behaviour occurs within earthquake faults, intermediate-term forecasting of large earthquakes may be feasible for some earthquakes but not for others, in alignment with certain empirical seismological observations. Further numerical investigation of dynamical models of this type may lead to advances in earthquake forecasting research and theoretical seismology.
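A minimal sketch of a cellular automaton of this general kind (the threshold rule, loading rate, and dissipation fraction here are illustrative assumptions, not the authors' specification): a 1-D fault whose cells load uniformly, fail at a strain threshold, and redistribute shed strain to all other cells with weight r^-p, so that varying p changes the effective interaction range.

```python
import numpy as np

def run_fault(n_cells=200, p=2.0, steps=5000, shed=0.9,
              transfer=0.8, seed=1):
    """Toy 1-D fault: uniform loading, threshold failure, and
    long-range redistribution of shed strain with weight r^-p.
    A fraction (1 - transfer) of shed strain is dissipated so
    that failure cascades terminate."""
    rng = np.random.default_rng(seed)
    strain = rng.uniform(0.0, 1.0, n_cells)
    idx = np.arange(n_cells)
    events = []                      # (time, total shed strain)
    for t in range(steps):
        strain += 0.001              # tectonic loading
        size = 0.0
        failing = np.flatnonzero(strain >= 1.0)
        while failing.size:          # cascade until stable
            for i in failing:
                drop = shed * strain[i]
                strain[i] -= drop
                size += drop
                r = np.abs(idx - i).astype(float)
                r[i] = np.inf        # no self-transfer (weight -> 0)
                w = r ** -p          # long-range interaction kernel
                strain += transfer * drop * w / w.sum()
            failing = np.flatnonzero(strain >= 1.0)
        if size > 0.0:
            events.append((t, size))
    return events

sizes = [s for _, s in run_fault()]
print(len(sizes), "events; largest:", round(max(sizes), 2))
```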
Abstract:
This trial compared the cost of an integrated home-based care model with traditional inpatient care for acute chronic obstructive pulmonary disease (COPD). 25 patients with acute COPD were randomised to either home or hospital management following a request for hospital admission. Costs per separation in the acute-care-at-home group ($745; 95% CI $595-$895; n = 13) were significantly lower (p < 0.01) than in the hospital group ($2543; 95% CI $1766-$3321; n = 12). There was an improvement in lung function in the hospital-managed group at the Outpatient Department review, decreased anxiety in the Emergency Department in the home-managed group, and equal patient satisfaction with care delivery. Acute care at home schemes can substitute for usual hospital care for some patients without adverse effects, and potentially release resources. A funding model that allows adequate resource delivery to the community will be needed if there is a move to devolve acute care to community providers.
Abstract:
We focus on mixtures of factor analyzers from the perspective of a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space. By working in this reduced space, the approach allows a model for each component-covariance matrix with complexity lying between that of the isotropic and full covariance structure models. We illustrate the use of mixtures of factor analyzers in a practical example that considers the clustering of cell lines on the basis of gene expressions from microarray experiments.
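To see why the latent factor space controls the parameter count when p is large relative to n, compare the free covariance parameters per mixture component under the full and factor-analytic models (a standard count, sketched here with illustrative dimensions reminiscent of the microarray setting):

```python
def full_cov_params(p: int) -> int:
    """Unrestricted symmetric p x p covariance: p(p+1)/2 parameters."""
    return p * (p + 1) // 2

def factor_cov_params(p: int, q: int) -> int:
    """Factor-analytic covariance Sigma = Lambda Lambda' + Psi with a
    p x q loading matrix and diagonal Psi: p*q + p parameters, less
    q(q-1)/2 for the rotational indeterminacy of the loadings."""
    return p * q + p - q * (q - 1) // 2

p, q = 1000, 4  # hypothetical: 1000 genes, 4 latent factors
print(full_cov_params(p))       # 500500
print(factor_cov_params(p, q))  # 4994
```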
Abstract:
We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
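A minimal sketch of how dichotomous twin data of this kind can be simulated under an ACE liability-threshold model (illustrative only; the variance components, threshold, and sample sizes below are assumptions, and this is not the paper's BUGS or Mx code):

```python
import numpy as np

def simulate_pairs(n_pairs, a2, c2, rg, threshold=0.5, seed=0):
    """Simulate dichotomous twin-pair data under an ACE liability
    model: additive genetic variance a2, common environment c2,
    unique environment e2 = 1 - a2 - c2. The within-pair liability
    covariance is a2*rg + c2, with genetic correlation rg = 1.0
    for MZ pairs and 0.5 for DZ pairs."""
    e2 = 1.0 - a2 - c2
    assert e2 > 0.0, "variance components must sum to less than 1"
    rng = np.random.default_rng(seed)
    cov = a2 * rg + c2
    sigma = np.array([[1.0, cov], [cov, 1.0]])
    liability = rng.multivariate_normal([0.0, 0.0], sigma, size=n_pairs)
    return (liability > threshold).astype(int)  # liability -> binary trait

mz = simulate_pairs(1000, a2=0.5, c2=0.2, rg=1.0)
dz = simulate_pairs(1000, a2=0.5, c2=0.2, rg=0.5)
print("MZ concordance:", (mz[:, 0] == mz[:, 1]).mean())
print("DZ concordance:", (dz[:, 0] == dz[:, 1]).mean())
```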