234 results for relative utility models


Relevance: 20.00%

Abstract:

As inorganic arsenic is a proven human carcinogen, significant effort has been made in recent decades to understand arsenic carcinogenesis using animal models, including rodents (rats and mice) and larger mammals such as beagles and monkeys. Transgenic animals were also used to test the carcinogenic effect of arsenicals, but until recently all models had failed to mimic satisfactorily the actual mechanism of arsenic carcinogenicity. However, within the past decade successful animal models have been developed using the most common strains of mice or rats. Thus dimethylarsinic acid (DMA), an organic arsenic compound which is the major metabolite of inorganic arsenicals in mammals, has been proven to be tumorigenic in such animals. Reports of successful cancer induction in animals by inorganic arsenic (arsenite and arsenate) have been rare, and most carcinogenicity studies have used organic arsenicals such as DMA combined with other tumor initiators. Although such experiments used high concentrations of arsenicals for the promotion of tumors, animal models using doses of arsenical species close to the exposure levels of humans in endemic areas are obviously the most significant. Almost all researchers have used drinking water or food as the pathway for the development of animal model test systems in order to mimic chronic arsenic poisoning in humans; such pathways seem more likely to achieve desirable results. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

Background: Thalamotomy has been reported to be successful in ameliorating the motor symptoms of tremor and/or rigidity in people with Parkinson's disease (PD), emphasising the bona fide contribution of this subcortical nucleus to the neural circuitry subserving motor function. Despite evidence of parallel yet segregated associative and motor cortico-subcortical-cortical circuits, comparatively few studies have investigated the effects of this procedure on cognitive functions. In particular, research pertaining to the impact of thalamotomy on linguistic processes is fundamentally lacking. Aims: The purpose of this research was to investigate the effects of thalamotomy in the language-dominant and non-dominant hemispheres on linguistic functioning, relative to operative theoretical models of subcortical participation in language. This paper compares the linguistic profiles of two males with PD, aged 75 years (10 years of formal education) and 62 years (22 years of formal education), subsequent to unilateral thalamotomy procedures within the language-dominant and non-dominant hemispheres, respectively. Methods & Procedures: Comprehensive linguistic profiles comprising general and high-level linguistic abilities, in addition to on-line semantic processing skills, were compiled up to 1 month prior to surgery and 3 months post-operatively, within perceived "on" periods (i.e., when optimally medicated). Pre- and post-operative language performances were compared within subjects, as well as to a group of 16 non-surgical Parkinson's controls (NSPD) and a group of 16 non-neurologically impaired adults (NC). Outcomes & Results: The findings of this research suggest a laterality effect with regard to the contribution of the thalamus to high-level linguistic abilities and, potentially, the temporal processing of semantic information. This outcome supports the application of high-level linguistic assessments and measures of semantic processing proficiency to the clinical management of individuals with dominant thalamic lesions. Conclusions: The results reported lend support to contemporary theories of dominant thalamic participation in language, serving to further elucidate our current understanding of the role of subcortical structures in mediating linguistic processes, relevant to cortical hemispheric dominance.

Relevance: 20.00%

Abstract:

Models of plant architecture allow us to explore how genotype × environment interactions affect the development of plant phenotypes. Such models generate masses of data organised in complex hierarchies. This paper presents a generic system for creating and automatically populating a relational database from data generated by the widely used L-system approach to modelling plant morphogenesis. Techniques from compiler technology are applied to generate attributes (new fields) in the database and to simplify query development for the recursively structured branching relationship. Use of biological terminology in an interactive query builder helps make the system biologist-friendly. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
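As a hedged illustration of the underlying mechanics, the sketch below derives a bracketed L-system string and flattens its branching structure into (id, parent_id, symbol) rows of the sort that could populate a relational table. The production rule and the row layout are invented for illustration; they are not the paper's grammar or schema.

```python
# Minimal L-system sketch: parallel rewriting of an axiom, then flattening the
# bracketed result into parent-child rows as a relational table might store it.
# RULES and the (id, parent_id, symbol) layout are hypothetical.

RULES = {"A": "F[+A][-A]"}  # one apex module branches into two

def derive(axiom: str, steps: int) -> str:
    """Apply the production rules to every symbol, in parallel, per step."""
    s = axiom
    for _ in range(steps):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

def to_rows(lstring: str):
    """Flatten a bracketed L-system string into (id, parent_id, symbol) rows."""
    rows, stack, parent, next_id = [], [], None, 0
    for ch in lstring:
        if ch == "[":
            stack.append(parent)        # open a branch: remember the parent
        elif ch == "]":
            parent = stack.pop()        # close the branch: restore the parent
        elif ch in "+-":
            continue                    # orientation commands produce no module
        else:
            rows.append((next_id, parent, ch))
            parent = next_id
            next_id += 1
    return rows

for row in to_rows(derive("A", 2)):
    print(row)                          # e.g. (2, 1, 'A'): module 2 hangs off 1
```

Storing the parent link explicitly is what makes the recursive branching relationship queryable; the compiler-style attribute generation the paper describes would add further derived fields on top of rows like these.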

Relevance: 20.00%

Abstract:

Here we consider the role of abstract models in advancing our understanding of movement pathology. Models of movement coordination and control provide the frameworks necessary for the design and interpretation of studies of acquired and developmental disorders. These models do not, however, provide the resolution necessary to reveal the nature of the functional impairments that characterise specific movement pathologies. Nor do they provide a mapping between the structural bases of various pathologies and the associated disorders of movement. Current and prospective approaches to the study and treatment of movement disorders are discussed. It is argued that the appreciation of structure-function relationships to which these approaches give rise represents both a challenge to current models of interlimb coordination and a stimulus for their continued development. (C) 2002 Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Abstract:

The interaction of pragmatic and syntactic constraints is a central issue in models of sentence processing. It is well established that object relatives (1) are harder to process than subject relatives (2). Passivization, other things being equal, increases sentence complexity. However, one of the functions of the passive construction is to promote an NP into the role of subject so that it can be more easily bound to the head NP in a higher clause. Thus, (3) is predicted to be marginally preferred over (1). Passivization in this instance may be seen as a way of avoiding the object relative construction. 1. The pipe that the traveller smoked annoyed the passengers. 2. The traveller that smoked the pipe annoyed the passengers. 3. The pipe that was smoked by the traveller annoyed the passengers. 4. The traveller that the pipe was smoked by annoyed the passengers. 5. The traveller that the lady was assaulted by annoyed the passengers. In (4) we have relativization of an NP which has been demoted by passivization to the status of a by-phrase. Such relative clauses may only be obtained under quite restrictive pragmatic conditions. Many languages do not permit relativization of a constituent as low as a by-phrase on the NP accessibility hierarchy (Comrie, 1984). The factors which determine the acceptability of demoted NP relatives like (4-5) reflect the ease with which the NP promoted to subject position can be taken as a discourse topic. We explored the acceptability of sentences such as (1-5) using pairwise judgements of same/different meaning, accompanied by ratings of ease of understanding. Results are discussed with reference to Gibson's DLT model of linguistic complexity and sentence processing (Gibson, 2000).

Relevance: 20.00%

Abstract:

Thomas & Tow's evaluation of the utility of human security is an important contribution to an ongoing debate about what security is and for whom security should be provided. In particular, the authors' engagement with the human security agenda is important given the centrality of this approach to recent attempts to rethink security. This article argues, however, that Thomas & Tow's approach to the human security agenda is problematic for two central reasons. First, their attempt to narrow security to make this approach amenable to state policymakers risks reifying the sources of insecurity for individuals everywhere. Second, the conception of human security they put forward appears largely inconsistent with the normative concerns inherent in the human security agenda.

Relevance: 20.00%

Abstract:

We use published and new trace element data to identify element ratios which discriminate between arc magmas from the supra-subduction zone mantle wedge and those formed by direct melting of subducted crust (i.e. adakites). The clearest distinction is obtained with those element ratios which are strongly fractionated during refertilisation of the depleted mantle wedge, ultimately reflecting slab dehydration. Hence, adakites have significantly lower Pb/Nd and B/Be but higher Nb/Ta than typical arc magmas and continental crust as a whole. Although Li and Be are also overenriched in continental crust, behaviour of Li/Yb and Be/Nd is more complex and these ratios do not provide unique signatures of slab melting. Archaean tonalite-trondhjemite-granodiorites (TTGs) strongly resemble ordinary mantle wedge-derived arc magmas in terms of fluid-mobile trace element content, implying that they did not form by slab melting but that they originated from mantle which was hydrated and enriched in elements lost from slabs during prograde dehydration. We suggest that Archaean TTGs formed by extensive fractional crystallisation from a mafic precursor. It is widely claimed that the time between the creation and subduction of oceanic lithosphere was significantly shorter in the Archaean (i.e. 20 Ma) than it is today. This difference was seen as an attractive explanation for the presumed preponderance of adakitic magmas during the first half of Earth's history. However, when we consider the effects of a higher potential mantle temperature on the thickness of oceanic crust, it follows that the mean age of oceanic lithosphere has remained virtually constant. Formation of adakites has therefore always depended on local plate geometry and not on potential mantle temperature.
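As a hedged sketch of the directional screening this implies, the snippet below flags a sample as adakite-like when its Pb/Nd and B/Be ratios fall below, and its Nb/Ta ratio above, those of a reference arc magma. All concentrations are invented placeholders, not data from the paper.

```python
# Hypothetical compositions in ppm; only the direction of each ratio comparison
# comes from the text, the numbers themselves are placeholders.
REFERENCE_ARC = {"Pb": 5.0, "Nd": 15.0, "B": 20.0, "Be": 1.0, "Nb": 2.0, "Ta": 0.12}

def ratios(c):
    """Trace element ratios used to separate slab melts from wedge-derived magmas."""
    return {"Pb/Nd": c["Pb"] / c["Nd"], "B/Be": c["B"] / c["Be"], "Nb/Ta": c["Nb"] / c["Ta"]}

def adakite_like(sample, reference=REFERENCE_ARC):
    """Lower Pb/Nd and B/Be, higher Nb/Ta than the arc reference => adakite-like."""
    s, ref = ratios(sample), ratios(reference)
    return s["Pb/Nd"] < ref["Pb/Nd"] and s["B/Be"] < ref["B/Be"] and s["Nb/Ta"] > ref["Nb/Ta"]

candidate = {"Pb": 2.0, "Nd": 20.0, "B": 5.0, "Be": 1.0, "Nb": 3.0, "Ta": 0.15}
print(adakite_like(candidate))  # True for this placeholder composition
```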

Relevance: 20.00%

Abstract:

Objective: To assess the (i) benefits, (ii) harms and (iii) costs of continuing mammographic screening for women 70 years and over. Data sources and synthesis: (i) We conducted a MEDLINE search (1966 - July 2000) for decision-analytic models estimating life-expectancy gains from screening in older women. The five studies meeting the inclusion criteria were critically appraised using standard criteria. We estimated relative benefit from each model's estimate of the effectiveness of screening in older women relative to that in women aged 50-69 years using the same model. (ii) With data from BreastScreen Queensland, we constructed balance sheets of the consequences of screening for women in 10-year age groups (40-49 to 80-89 years), and (iii) we used a validated model to estimate the marginal cost-effectiveness of extending screening to women 70 years and over. Results: (i) For women aged 70-79 years, the relative benefit was estimated at 40%-72%, and at 18%-62% with adjustment for the impact of screening on quality of life. For women over 80 years the relative benefit was about a third of that in women aged 50-69 years, and with quality-of-life adjustment only 14%. (ii) Of 10 000 Australian women participating in ongoing screening, about 400 are recalled for further testing, and, depending on age, about 70-112 undergo biopsy and about 19-80 cancers are detected. (iii) Cost-effectiveness estimates for extending the upper age limit for mammographic screening from 69 to 79 years range from $8119 to $27 751 per quality-adjusted life-year saved, which compares favourably with extending screening to women aged 40-49 years (estimated at between $24 000 and $65 000 per life-year saved). Conclusions: Women 70 years and over, in consultation with their healthcare providers, may want to decide for themselves whether to continue mammographic screening. Decision-support materials are needed for women in this age group.
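The marginal cost-effectiveness figures quoted above are incremental cost-effectiveness ratios. For reference, the standard definition (background convention, not an equation reproduced from the paper) is:

```latex
\[
\mathrm{ICER} \;=\; \frac{C_{\text{screen to 79}} - C_{\text{screen to 69}}}
                         {\mathrm{QALY}_{\text{screen to 79}} - \mathrm{QALY}_{\text{screen to 69}}}
\]
```

So, for example, an extra $10 000 in screening costs that buys half a quality-adjusted life-year yields $20 000 per QALY saved, within the range reported.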

Relevance: 20.00%

Abstract:

The rheological behaviour of nine unprocessed Australian honeys was investigated for the applicability of the Williams-Landel-Ferry (WLF) model. The viscosity of the honeys was obtained over a range of shear rates (0.01-40 s^-1) from 2 to 40 °C, and all the honeys exhibited Newtonian behaviour, with viscosity decreasing as the temperature was increased. The honeys with high moisture were of lower viscosity. The glass transition temperatures of the honeys, as measured with a differential scanning calorimeter (DSC), ranged from -40 to -46 °C, and four models (WLF, Arrhenius, Vogel-Tammann-Fulcher (VTF), and power-law) were investigated to describe the temperature dependence of the viscosity. The WLF model was the most suitable: its correlation coefficient averaged 0.999 +/- 0.0013 as against 0.996 +/- 0.0042 for the Arrhenius model, while the mean relative deviation modulus was 0-12% for the WLF model and 10-40% for the Arrhenius one. With the universal values for the WLF constants, the temperature dependence of the viscosity was poorly predicted. From non-linear regression analysis, the constants of the WLF model for the honeys were obtained (C1 = 13.7-21.1; C2 = 55.9-118.7) and are different from the universal values. These WLF constants will be valuable for adequate modelling of the rheology of the honeys, and they can be used to assess the temperature sensitivity of the honeys. (C) 2002 Elsevier Science Ltd. All rights reserved.
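For reference, the WLF equation in its standard form relates the viscosity at temperature T to its value at the glass transition temperature Tg via the two fitted constants reported above (standard form of the model, not copied from the paper):

```latex
\[
\log_{10}\frac{\eta(T)}{\eta(T_g)} \;=\; \frac{-C_1\,(T - T_g)}{C_2 + (T - T_g)}
\]
```

The finding that the fitted C1 and C2 differ from the universal values (C1 = 17.44, C2 = 51.6 K) is what makes honey-specific constants necessary for accurate viscosity prediction.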

Relevance: 20.00%

Abstract:

Figures on the relative frequency of synthetic and composite future forms in Ouest-France are presented and compared with those of earlier studies on the passé simple and passé composé. The synthetic future is found to be dominant. Possible formal explanations for the distribution are found to be inconclusive. Distribution across different text-types is more promising, since contrastive functions of the two forms can be identified in texts where they co-occur. The composite future typically reports new proposals or plans as current news, while the synthetic future outlines details that will be realised at the time of implementation. Both functions are important in dailies, but current news is more often expressed in the present tense at the expense of the composite future.

Relevance: 20.00%

Abstract:

Evaluation of the performance of the APACHE III (Acute Physiology and Chronic Health Evaluation) ICU (intensive care unit) and hospital mortality models at the Princess Alexandra Hospital, Brisbane is reported. Prospective collection of demographic, diagnostic, physiological, laboratory, admission and discharge data of 5681 consecutive eligible admissions (1 January 1995 to 1 January 2000) was conducted at the Princess Alexandra Hospital, a metropolitan Australian tertiary referral medical/surgical adult ICU. ROC (receiver operating characteristic) curve areas for the APACHE III ICU mortality and hospital mortality models demonstrated excellent discrimination. Observed ICU mortality (9.1%) was significantly overestimated by the APACHE III model adjusted for hospital characteristics (10.1%), but did not differ significantly from the prediction of the generic APACHE III model (8.6%). In contrast, observed hospital mortality (14.8%) agreed well with the prediction of the APACHE III model adjusted for hospital characteristics (14.6%), but was significantly underestimated by the unadjusted APACHE III model (13.2%). Calibration curves and goodness-of-fit analysis using Hosmer-Lemeshow statistics demonstrated that calibration was good for the unadjusted APACHE III ICU mortality model and for the APACHE III hospital mortality model adjusted for hospital characteristics. Post hoc analysis revealed a declining annual SMR (standardized mortality ratio) during the study period. This trend was present in each of the non-surgical, emergency and elective surgical diagnostic groups, and the change was temporally related to increased specialist staffing levels. This study demonstrates that the APACHE III model performs well on independent assessment in an Australian hospital. Changes observed in annual SMR using such a validated model support the hypothesis of improved survival outcomes over 1995-1999.
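A minimal sketch of the two headline checks reported here, discrimination (ROC curve area) and the standardized mortality ratio (observed deaths over the sum of model-predicted death probabilities), run on synthetic stand-in data rather than the Princess Alexandra cohort:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
predicted_risk = rng.beta(2, 12, size=5681)   # stand-in for APACHE III probabilities
died = rng.random(5681) < predicted_risk      # synthetic outcomes, not cohort data

auc = roc_auc_score(died, predicted_risk)     # discrimination: ROC curve area
smr = died.sum() / predicted_risk.sum()       # calibration-in-the-large: observed/expected
print(f"ROC area: {auc:.3f}, SMR: {smr:.2f}")
```

An SMR below 1 corresponds to the overestimation reported for the adjusted ICU model; the declining annual SMR was obtained by computing this ratio year by year.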

Relevance: 20.00%

Abstract:

It has been argued that power-law time-to-failure fits for cumulative Benioff strain, and an evolution in size-frequency statistics in the lead-up to large earthquakes, are evidence that the crust behaves as a Critical Point (CP) system. If so, intermediate-term earthquake prediction is possible. However, this hypothesis has not been proven. If the crust does behave as a CP system, stress correlation lengths should grow in the lead-up to large events through the action of small to moderate ruptures, and drop sharply once a large event occurs. However, this evolution in stress correlation lengths cannot be observed directly. Here we show, using the lattice solid model to describe discontinuous elasto-dynamic systems subjected to shear and compression, that it is possible for correlation lengths to exhibit CP-type evolution. In the case of a granular system subjected to shear, this evolution occurs in the lead-up to the largest event and is accompanied by an increasing rate of moderate-sized events and power-law acceleration of Benioff strain release. In the case of an intact sample system subjected to compression, the evolution occurs only after a mature fracture system has developed. The results support the existence of a physical mechanism for intermediate-term earthquake forecasting and suggest this mechanism is fault-system dependent. This offers an explanation of why accelerating Benioff strain release is not observed prior to all large earthquakes. The results prove the existence of an underlying evolution in discontinuous elasto-dynamic systems which is capable of providing a basis for forecasting catastrophic failure and earthquakes.
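The power-law time-to-failure fit referred to here is conventionally written as below, with cumulative Benioff strain defined as the running sum of the square roots of event energies (the standard accelerating-moment-release form; background convention, not an equation taken from this paper):

```latex
\[
\varepsilon(t) \;=\; \sum_{t_i \le t} \sqrt{E_i},
\qquad
\varepsilon(t) \;\approx\; A + B\,(t_f - t)^{m}
\]
```

Here t_f is the failure time of the large event, and accelerating release corresponds to B < 0 with 0 < m < 1.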

Relevance: 20.00%

Abstract:

We introduce a conceptual model for the in-plane physics of an earthquake fault. The model employs cellular automaton techniques to simulate tectonic loading, earthquake rupture, and strain redistribution. The impact of a hypothetical crustal elastodynamic Green's function is approximated by a long-range strain redistribution law with an r^-p dependence. We investigate the influence of the effective elastodynamic interaction range upon the dynamical behaviour of the model by conducting experiments with different values of the exponent p. The results indicate that this model has two distinct, stable modes of behaviour. The first mode produces a characteristic earthquake distribution, with moderate to large events preceded by an interval of time in which the rate of energy release accelerates. A correlation function analysis reveals that accelerating sequences are associated with a systematic, global evolution of strain energy correlations within the system. The second stable mode produces Gutenberg-Richter (GR) statistics, with near-linear energy release and no significant global correlation evolution. A model with effectively short-range interactions preferentially displays Gutenberg-Richter behaviour. However, models with long-range interactions appear to switch between the characteristic and GR modes. As the range of elastodynamic interactions is increased, characteristic behaviour begins to dominate GR behaviour. These models demonstrate that evolution of strain energy correlations may occur within systems with a fixed elastodynamic interaction range. Supposing that similar mode-switching dynamical behaviour occurs within earthquake faults, then intermediate-term forecasting of large earthquakes may be feasible for some earthquakes but not for others, in alignment with certain empirical seismological observations. Further numerical investigation of dynamical models of this type may lead to advances in earthquake forecasting research and theoretical seismology.
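A minimal sketch of a cellular automaton of this general kind: cells accumulate strain under uniform loading, rupture at a threshold, and redistribute most of the dropped strain to every other cell with weight proportional to r^-p. Grid size, threshold, loading rate, and dissipation below are illustrative choices, not the authors' parameters.

```python
import numpy as np

N, THRESHOLD, P, DISSIPATION, LOAD = 32, 1.0, 2.0, 0.05, 0.01
rng = np.random.default_rng(0)
strain = rng.uniform(0.0, THRESHOLD, size=N * N)   # one strain value per cell

# Pairwise r^-p redistribution weights, normalised so each rupture transfers
# (1 - DISSIPATION) of the dropped strain to the rest of the lattice.
ix, iy = np.meshgrid(np.arange(N), np.arange(N))
coords = np.column_stack([ix.ravel(), iy.ravel()]).astype(float)
r = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
np.fill_diagonal(r, np.inf)                        # a cell does not reload itself
w = r ** -P
w /= w.sum(axis=1, keepdims=True)

def step(s):
    """One loading increment plus any triggered cascade; returns energy released."""
    s += LOAD                                      # uniform tectonic loading
    released = 0.0
    while (failing := np.flatnonzero(s >= THRESHOLD)).size:
        for i in failing:
            drop = s[i]
            released += drop
            s[i] = 0.0                             # the cell ruptures completely
            s += (1.0 - DISSIPATION) * drop * w[i] # long-range strain transfer
    return released

events = [step(strain) for _ in range(500)]
print(f"ruptured steps: {sum(e > 0 for e in events)}, largest release: {max(events):.2f}")
```

Raising P shortens the effective interaction range; in the paper's terms, sweeping the exponent is what moves such a system between Gutenberg-Richter and characteristic-earthquake behaviour.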