936 results for PROPORTIONAL HAZARD AND ACCELERATED FAILURE MODELS



Abstract:

Background: It is well known that the Amazon region presents a huge biodiversity; therefore, countless natural resources are being employed in the production of phytocosmetics and phytomedicines. Objective: The purpose of this work was to obtain emulsions produced with Buriti oil and nonionic surfactants. Methods: Two surfactant systems were employed (Steareth-2 associated with Ceteareth-5 and with Ceteareth-20) to produce the emulsions using the phase diagram method. Emulsions were obtained by the echo-planar imaging method at 75 °C. Rheological behavior and zeta potential were evaluated, and accelerated stability tests were performed. Results: All emulsions analyzed presented pseudoplastic behavior. Zeta potential values were obtained between -14.2 and -53.3 mV. The formulations did not show changes in physical stability, pH, or rheological behavior after the accelerated stability tests. Significant differences were observed only after the temperature cycling test. Conclusion: Based on these results, the emulsions obtained can be considered promising delivery systems.
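
The pseudoplastic (shear-thinning) behavior reported here is conventionally quantified by fitting flow-curve data to the Ostwald-de Waele power-law model, stress = K * (shear rate)^n, with flow index n < 1. A minimal Python sketch of such a fit; the flow-curve values, K, and n below are hypothetical, not measurements from this study:

```python
import numpy as np
from scipy.optimize import curve_fit

# Ostwald-de Waele (power-law) model: stress = K * shear_rate**n
# n < 1 indicates pseudoplastic (shear-thinning) behavior.
def power_law(shear_rate, K, n):
    return K * shear_rate ** n

# Hypothetical flow-curve data (shear rate in 1/s, shear stress in Pa);
# not values from the study.
shear_rate = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 500.0])
stress = np.array([2.1, 6.8, 10.5, 31.2, 47.0, 140.0])

(K, n), _ = curve_fit(power_law, shear_rate, stress, p0=(1.0, 0.5))
print(f"consistency K = {K:.2f} Pa.s^n, flow index n = {n:.2f}")
print("pseudoplastic" if n < 1 else "not pseudoplastic")
```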


Abstract:

Analytical expressions are derived for the time and magnitude of failure of an isothermal CSTR with substrate-inhibited kinetics, caused by slow catalyst deactivation under three types of parallel and series mechanisms. Reactors operating at high space velocity are found to be most susceptible to early failure, and poisoning by product is more dangerous than poisoning by reactant. The magnitude of the jump across steady states depends solely on the Langmuir-Hinshelwood kinetic parameters, and a detailed analysis of reactor behavior during the jump itself is given.
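
The failure mechanism described here, where slowly decaying catalyst activity eventually erases the high-conversion steady state and the reactor jumps to a low-conversion branch, can be illustrated numerically. The sketch below uses a generic substrate-inhibited rate law; all parameter values (k, K, k_d, etc.) are illustrative, not the paper's. It predicts the failure time from the saddle-node condition and confirms the jump by integrating the mass balance:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

# Isothermal CSTR with substrate-inhibited (Langmuir-Hinshelwood) kinetics
# and slow first-order catalyst deactivation a(t) = exp(-k_d * t):
#   dC/dt = (C_in - C)/tau - a(t) * k * C / (1 + K*C)**2
# Parameter values are illustrative, not taken from the paper.
k, K, C_in, tau, k_d = 50.0, 10.0, 1.0, 1.0, 0.01

def rhs(t, y):
    a = np.exp(-k_d * t)
    C = y[0]
    return [(C_in - C) / tau - a * k * C / (1.0 + K * C) ** 2]

# Steady states satisfy a = g(C) = (C_in - C)(1 + K*C)^2 / (tau*k*C).
# The high-conversion branch exists while a(t) exceeds the local minimum
# of g, so failure occurs when a(t*) = g_min, i.e. t* = -ln(g_min)/k_d.
g = lambda C: (C_in - C) * (1.0 + K * C) ** 2 / (tau * k * C)
g_min = minimize_scalar(g, bounds=(0.01, 0.3), method="bounded").fun
print(f"predicted failure time t* ~ {-np.log(g_min) / k_d:.1f}")

# Simulation: start on the high-conversion branch and watch the jump.
sol = solve_ivp(rhs, (0.0, 200.0), [0.035], max_step=0.5)
jump = np.argmax(sol.y[0] > 0.3)  # first time C leaves the high-conversion branch
print(f"simulated jump across steady states near t ~ {sol.t[jump]:.0f}")
```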


Abstract:

Except for a few large scale projects, language planners have tended to talk and argue among themselves rather than to see language policy development as an inherently political process. A comparison with a social policy example, taken from the United States, suggests that it is important to understand the problem and to develop solutions in the context of the political process, as this is where decisions will ultimately be made.


Abstract:

Impulsivity based on Gray's [Gray, J. A. (1982). The neuropsychology of anxiety: an enquiry into the functions of the septo-hippocampal system. New York: Oxford University Press; Gray, J. A. (1991). The neurophysiology of temperament. In J. Strelau & A. Angleitner (Eds.), Explorations in temperament: international perspectives on theory and measurement. London: Plenum Press] physiological model of personality was hypothesised to be more predictive of goal-oriented criteria within the workplace than scales derived from Eysenck's [Eysenck, H. J. (1967). The biological basis of personality. Springfield, IL: Charles C Thomas] physiological model of personality. Results confirmed the hypothesis and also showed that Gray's scale of Impulsivity was generally a better predictor than attributional style and interest in money. Results were interpreted as providing support for Gray's Behavioural Activation System, which moderates response to reward. (C) 2001 Elsevier Science Ltd. All rights reserved.


Abstract:

The Eysenck Personality Questionnaire-Revised (EPQ-R), the Eysenck Personality Profiler Short Version (EPP-S), and the Big Five Inventory (BFI-V4a) were administered to 135 postgraduate students of business in Pakistan. Whilst the Extraversion and Neuroticism scales from the three questionnaires were highly correlated, it was found that Agreeableness was most highly correlated with Psychoticism in the EPQ-R and Conscientiousness was most highly correlated with Psychoticism in the EPP-S. Principal component analyses with varimax rotation were carried out. The analyses generally suggested that the five-factor model rather than the three-factor model was more robust and better for interpretation of all the higher-order scales of the EPQ-R, EPP-S, and BFI-V4a in the Pakistani data. Results show that the superiority of the five-factor solution results from the inclusion of a broader variety of personality scales in the input data, whereas Eysenck's three-factor solution seems to be best when a less complete but possibly more important set of variables is input. (C) 2001 Elsevier Science Ltd. All rights reserved.
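
Principal component analysis followed by varimax rotation, as used above, can be reproduced with a few lines of linear algebra. The sketch below implements the standard varimax iteration and applies it to hypothetical scale scores; the random data and the choice of five components are placeholders, not the study's data set:

```python
import numpy as np

def varimax(loadings, tol=1e-6, max_iter=100):
    """Varimax rotation of a component-loading matrix (standard algorithm)."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - L @ np.diag((L ** 2).sum(axis=0)) / p))
        R = u @ vt                     # rotation that increases loading variance
        d_new = s.sum()
        if d_new < d * (1.0 + tol):    # converged: criterion stopped improving
            break
        d = d_new
    return loadings @ R

# Hypothetical scale scores (rows = respondents, columns = personality scales);
# not the Pakistani data set analysed in the paper.
rng = np.random.default_rng(0)
X = rng.normal(size=(135, 8))
X -= X.mean(axis=0)

# Principal components via SVD of the centred data matrix.
u, s, vt = np.linalg.svd(X, full_matrices=False)
loadings = vt[:5].T * (s[:5] / np.sqrt(len(X) - 1))  # loadings for 5 components
print(varimax(loadings).round(2))
```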


Abstract:

We solve the Sp(N) Heisenberg and SU(N) Hubbard-Heisenberg models on the anisotropic triangular lattice in the large-N limit. These two models may describe respectively the magnetic and electronic properties of the family of layered organic materials κ-(BEDT-TTF)2X. The Heisenberg model is also relevant to the frustrated antiferromagnet Cs2CuCl4. We find rich phase diagrams for each model. The Sp(N) antiferromagnet is shown to have five different phases as a function of the size of the spin and the degree of anisotropy of the triangular lattice. The effects of fluctuations at finite N are also discussed. For parameters relevant to Cs2CuCl4 the ground state either exhibits incommensurate spin order, or is in a quantum disordered phase with deconfined spin-1/2 excitations and topological order. The SU(N) Hubbard-Heisenberg model exhibits an insulating dimer phase, an insulating box phase, a semi-metallic staggered flux phase (SFP), and a metallic uniform phase. The uniform and SFP phases exhibit a pseudogap. A metal-insulator transition occurs at intermediate values of the interaction strength.


Abstract:

Understanding the genetic architecture of quantitative traits can greatly assist the design of strategies for their manipulation in plant-breeding programs. For a number of traits, genetic variation can be the result of segregation of a few major genes and many polygenes (minor genes). The joint segregation analysis (JSA) is a maximum-likelihood approach for fitting segregation models through the simultaneous use of phenotypic information from multiple generations. Our objective in this paper was to use computer simulation to quantify the power of the JSA method for testing the mixed-inheritance model for quantitative traits when applied to the six basic generations: both parents (P1 and P2), F1, F2, and both backcross generations (B1 and B2) derived from crossing the F1 to each parent. A total of 1968 genetic model-experiment scenarios were considered in the simulation study. Factors that interacted to influence the power of the JSA method to correctly detect genetic models were: (1) whether there were one or two major genes in combination with polygenes, (2) the heritability of the major genes and polygenes, (3) the level of dispersion of the major genes and polygenes between the two parents, and (4) the number of individuals examined in each generation (population size). The greatest power was observed for the genetic models with simple inheritance; e.g., the power was greater than 90% for the one-major-gene model, regardless of the population size and major-gene heritability. Lower power was observed for the genetic models with complex inheritance (major genes and polygenes), low heritability, small population sizes, and a large dispersion of favourable genes between the two parents; e.g., the power was less than 5% for the two-major-gene model with a heritability value of 0.3 and population sizes of 100 individuals. The JSA methodology was then applied to a previously studied sorghum data set to investigate the genetic control of the putative drought-resistance trait osmotic adjustment in three crosses. The previous study concluded that there were two major genes segregating for osmotic adjustment in the three crosses. Application of the JSA method resulted in a change in the proposed genetic model: the presence of the two major genes was confirmed, with the addition of an unspecified number of polygenes.
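
The power calculations described above follow a standard recipe: simulate phenotypes under a known mixed-inheritance model, fit competing models by maximum likelihood, and count how often the correct model is detected. The sketch below illustrates the idea for a single additive major gene plus a normal polygenic/environmental residual in an F2 only, tested with a likelihood-ratio statistic; it is a simplified stand-in for the full six-generation JSA, and the parameter values and chi-square reference distribution are simplifying assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2, norm

rng = np.random.default_rng(1)

def simulate_f2(n, a=1.0, h2_major=0.3):
    """F2 phenotypes: one additive major gene (genotypic values -a, 0, +a with
    Mendelian 1:2:1 frequencies) plus a normal residual standing in for
    polygenes and environment. Illustrative, not the full JSA model."""
    geno = rng.choice([-a, 0.0, a], size=n, p=[0.25, 0.5, 0.25])
    var_major = 0.5 * a ** 2                      # major-gene variance in an F2
    sd_resid = np.sqrt(var_major * (1.0 - h2_major) / h2_major)
    return geno + rng.normal(0.0, sd_resid, size=n)

def nll_mixture(theta, y):
    """Negative log-likelihood of a 1:2:1 three-component normal mixture."""
    mu, a, sd = theta
    dens = (0.25 * norm.pdf(y, mu - a, abs(sd))
            + 0.50 * norm.pdf(y, mu, abs(sd))
            + 0.25 * norm.pdf(y, mu + a, abs(sd)))
    return -np.log(dens).sum()

def power(n, reps=200, alpha=0.05):
    """Proportion of replicates where the major-gene model beats a single
    normal (chi-square with df=1 is a rough reference for this mixture LRT)."""
    hits = 0
    for _ in range(reps):
        y = simulate_f2(n)
        ll0 = norm.logpdf(y, y.mean(), y.std()).sum()   # H0: no major gene
        fit = minimize(nll_mixture, x0=[y.mean(), y.std(), y.std()],
                       args=(y,), method="Nelder-Mead")
        hits += 2 * (-fit.fun - ll0) > chi2.ppf(1 - alpha, df=1)
    return hits / reps

print(f"power to detect the major gene at n = 100: {power(100):.2f}")
```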


Abstract:

Endothelial function plays a key role in the local regulation of vascular tone. Alterations in endothelial function may result in impaired release of endothelium-derived relaxing factors or increased release of endothelium-derived contracting factors. Heart failure may impair endothelial function by means of reduced synthesis and release of nitric oxide (NO), increased degradation of NO, and increased production of endothelin-1. Endothelial dysfunction may in turn worsen heart function through peripheral effects, such as increased afterload, and central effects, such as myocardial ischemia and inducible nitric oxide synthase (iNOS)-induced detrimental effects. Evidence from clinical studies has suggested that there is a correlation between decreased endothelial function and increasing severity of congestive heart failure (CHF). Treatments that improve heart function may also improve endothelial dysfunction. The relationship between endothelial dysfunction and heart failure may be masked by the stage of endothelial dysfunction, the location of the vessels being tested, and the state of the endothelium-dependent vasodilatation response.


Abstract:

Here we consider the role of abstract models in advancing our understanding of movement pathology. Models of movement coordination and control provide the frameworks necessary for the design and interpretation of studies of acquired and developmental disorders. These models do not however provide the resolution necessary to reveal the nature of the functional impairments that characterise specific movement pathologies. In addition, they do not provide a mapping between the structural bases of various pathologies and the associated disorders of movement. Current and prospective approaches to the study and treatment of movement disorders are discussed. It is argued that the appreciation of structure-function relationships, to which these approaches give rise, represents a challenge to current models of interlimb coordination, and a stimulus for their continued development. (C) 2002 Elsevier Science B.V. All rights reserved.


Abstract:

We compare Bayesian methodology utilizing the freeware package BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
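
The ACE decomposition underlying both approaches splits phenotypic variance into additive genetic (A), common environment (C), and unique environment (E) components, exploiting the fact that MZ pairs share all additive genetic variance while DZ pairs share half on average. Below is a minimal generative sketch with quick moment (Falconer-style) estimates, rather than the BUGS or Mx fits used in the paper; the sample size and variance shares are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_twins(n_pairs, a2, c2, mz=True):
    """Standardized twin-pair phenotypes under an ACE model: additive genes
    (A), common environment (C), unique environment (E). MZ twins share all
    of A; DZ twins share half of it in expectation."""
    e2 = 1.0 - a2 - c2
    shared_a = rng.normal(size=n_pairs)          # additive part shared by the pair
    c = rng.normal(size=n_pairs)                 # common environment, fully shared
    def twin():
        if mz:
            a = shared_a
        else:  # DZ: half shared, half independent additive variance
            a = np.sqrt(0.5) * shared_a + np.sqrt(0.5) * rng.normal(size=n_pairs)
        return (np.sqrt(a2) * a + np.sqrt(c2) * c
                + np.sqrt(e2) * rng.normal(size=n_pairs))
    return twin(), twin()

# 2000 pairs per zygosity with A = 50%, C = 20% (one of the scenarios the
# paper found hard to detect with binary data).
mz1, mz2 = simulate_twins(2000, a2=0.5, c2=0.2, mz=True)
dz1, dz2 = simulate_twins(2000, a2=0.5, c2=0.2, mz=False)

r_mz = np.corrcoef(mz1, mz2)[0, 1]
r_dz = np.corrcoef(dz1, dz2)[0, 1]
# Falconer-style moment estimates (a quick check, not the Bayesian fit):
print(f"A ~ {2 * (r_mz - r_dz):.2f}, C ~ {2 * r_dz - r_mz:.2f}, E ~ {1 - r_mz:.2f}")
```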


Abstract:

For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the output of two independent implementations, in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
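
The residual-generation step can be pictured with a toy example: code the same discrete-time model twice, independently, drive both with identical inputs, and difference the outputs. A nonzero residual flags a coding error. The sketch below shows only this back-to-back comparison, not the paper's modified observer that shapes the residual direction; the model, the planted bug, and the input signal are all invented:

```python
import numpy as np

# Back-to-back testing of two independently coded implementations of the
# same discrete-time model:
#   x[k+1] = A x[k] + B u[k],   y[k] = C x[k]
# Identical inputs drive both; the residual r[k] = y_ref[k] - y_test[k]
# is zero when both codes are correct.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

def reference(u):
    """Matrix-based implementation."""
    x = np.zeros((2, 1))
    for uk in u:
        yield (C @ x).item()
        x = A @ x + B * uk

def under_test(u, bug=False):
    """Element-wise implementation; `bug` plants an error in one equation."""
    x = [0.0, 0.0]
    for uk in u:
        yield x[0]                       # y = C x, written out by hand
        coeff = 0.7 if bug else 0.8      # deliberate coding error when bug=True
        x = [0.9 * x[0] + 0.1 * x[1], coeff * x[1] + uk]

u = np.random.default_rng(3).normal(size=200)
r = np.array(list(reference(u))) - np.array(list(under_test(u, bug=True)))
print("max |residual|:", np.abs(r).max())  # nonzero => a coding error is present
```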


Abstract:

In Part 1 of this paper a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright (C) 2003 John Wiley & Sons, Ltd.
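
The isolation step can be sketched as a subspace test: compute an orthonormal basis for the space spanned by the observed residuals, then check each candidate error's feature matrix against it. Feature matrices whose span falls outside the residual space are 'impossible'; those inside are 'possible' candidates, to be further resolved by the paper's subset-testing algorithm. Everything below (feature matrices, simulated residuals, tolerances) is invented for illustration:

```python
import numpy as np

def colspace(M, tol=1e-8):
    """Orthonormal basis for the column space of M."""
    u, s, _ = np.linalg.svd(M, full_matrices=False)
    return u[:, s > tol * s.max()]

def contained(F, basis, tol=1e-8):
    """True if span(F) lies inside the subspace spanned by `basis`."""
    Q = colspace(F)
    proj = basis @ (basis.T @ Q)       # project span(F) onto the residual space
    return np.linalg.norm(Q - proj) < tol

# Hypothetical feature matrices for three candidate coding errors, giving
# the residual directions each error would excite.
F = {
    "error_1": np.array([[1.0], [0.0], [0.0]]),
    "error_2": np.array([[0.0], [1.0], [0.0]]),
    "error_3": np.array([[0.0], [0.0], [1.0]]),
}

# Simulated residual samples excited by errors 1 and 2 only.
rng = np.random.default_rng(4)
R = F["error_1"] @ rng.normal(size=(1, 50)) + F["error_2"] @ rng.normal(size=(1, 50))

S = colspace(R)
for name, Fi in F.items():
    print(name, "->", "possible" if contained(Fi, S) else "impossible")
# Errors whose combined feature matrices span S exactly are isolated;
# promoting 'possible' to 'definite' uses the paper's subset-testing step.
```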