24 results for PROPORTIONAL HAZARD AND ACCELERATED FAILURE MODELS


Relevance:

100.00%

Publisher:

Abstract:

Jackson (2005) developed a hybrid model of personality and learning, known as the learning styles profiler (LSP), which was designed to span the biological, socio-cognitive, and experiential research foci of personality and learning research. The hybrid model argues that functional and dysfunctional learning outcomes can be best understood in terms of how cognitions and experiences control, discipline, and re-express the biologically based scale of sensation-seeking. In two studies with part-time workers undertaking tertiary education (N = 137 and 58), established models of approach and avoidance from each of the three research foci were compared with Jackson's hybrid model in their ability to predict leadership, work, and university outcomes using self-report and supervisor ratings. Results showed that the hybrid model was generally optimal and, as hypothesized, that goal orientation mediated the effect of sensation-seeking on outcomes (work performance, university performance, leader behaviours, and counterproductive work behaviour). Our studies suggest that the hybrid model has considerable promise as a predictor of work and educational outcomes as well as dysfunctional outcomes.

Relevance:

100.00%

Publisher:

Abstract:

Hazard and operability (HAZOP) studies on chemical process plants are very time-consuming, and often tedious, tasks. A HAZOP study requires a team of experts to systematically analyse every conceivable process deviation, identifying possible causes and any hazards that may result. The systematic nature of the task, and the fact that some team members may be unoccupied for much of the time, can lead to tedium, which in turn may lead to serious errors or omissions. Fault trees are a useful aid to HAZOP: they present the system failure logic graphically so that the study team can readily assimilate the findings. Fault trees are also useful for identifying design weaknesses, and may additionally be used to estimate the likelihood of hazardous events occurring. Their one drawback is that, because of the sheer size and complexity of modern process plants, they are difficult to generate by hand. The work in this thesis proposes a computer-based method to aid the development of fault trees for chemical process plants. The aim is to produce concise, structured fault trees that are easy for analysts to understand. Standard plant input-output equation models for major process units are modified so that they include ancillary units and pipework, which reduces the number of nodes required to represent a plant. Control loops and protective systems are modelled as operators that act on process variables. This modelling maintains the functionality of loops, making fault tree generation easier and improving the structure of the fault trees produced. A method called event ordering is proposed, which allows the magnitude of deviations of controlled or measured variables to be defined in terms of the control loops and protective systems with which they are associated.
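
The fault-tree representation described above can be illustrated with a small sketch. The node class, gate names, and example events below are illustrative assumptions, not material from the thesis; they simply show how AND/OR failure logic can be evaluated bottom-up once a tree has been generated.

from dataclasses import dataclass, field
from typing import List

@dataclass
class FaultTreeNode:
    label: str
    gate: str = "BASIC"             # "BASIC", "AND" or "OR"
    prob: float = 0.0               # failure probability of a basic event
    children: List["FaultTreeNode"] = field(default_factory=list)

    def probability(self) -> float:
        """Estimate the top-event probability, assuming independent basic events."""
        if self.gate == "BASIC":
            return self.prob
        child_probs = [c.probability() for c in self.children]
        if self.gate == "AND":
            p = 1.0
            for cp in child_probs:
                p *= cp
            return p
        # OR gate: 1 minus the product of the complements
        p = 1.0
        for cp in child_probs:
            p *= (1.0 - cp)
        return 1.0 - p

# Hypothetical deviation: high level in a vessel, caused by an inlet valve stuck
# open OR a failed level-control loop (sensor fails high AND no operator action).
top = FaultTreeNode("High level in vessel", "OR", children=[
    FaultTreeNode("Inlet valve stuck open", prob=1e-3),
    FaultTreeNode("Level control loop fails", "AND", children=[
        FaultTreeNode("Level sensor fails high", prob=5e-4),
        FaultTreeNode("No operator intervention", prob=1e-1),
    ]),
])
print(f"Top event probability ~ {top.probability():.2e}")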

Relevance:

100.00%

Publisher:

Abstract:

How are innovative new business models established if organizations constantly compare themselves against existing criteria and expectations? The objective is to address this question from the perspective of innovators and their ability to redefine established expectations and evaluation criteria. The research questions ask whether there are discernible patterns of discursive action through which innovators theorize institutional change, and what role such theorizations play in mobilizing support and realizing change projects. These questions are investigated through a case study on a critical area of enterprise computing software, Java application servers. In the present case, business practices and models were already well established among incumbents, with critical market areas allocated to a few dominant firms. Fringe players started experimenting with a new business approach of selling services around freely available open-source application servers. While most new players struggled, one new entrant succeeded in leading incumbents to adopt and compete on the new model. The case demonstrates that innovative and substantially new models and practices are established in organizational fields when innovators are able to redefine expectations and evaluation criteria within the field. The study addresses the theoretical paradox of embedded agency: actors who are embedded in prevailing institutional logics and structures find it hard to perceive potentially disruptive opportunities that fall outside existing ways of doing things, and changing those logics and structures requires strategic and institutional work aimed at overcoming barriers to innovation. The study addresses this problem through the lens of (new) institutional theory, using a discourse methodology to trace the process through which innovators were able to establish a new social and business model in the field.

Relevance:

100.00%

Publisher:

Abstract:

Common approaches to IP-traffic modelling have featured the use of stochastic models based on the Markov property, which can be classified into black box and white box models according to the approach used for modelling traffic. White box models are simple to understand, transparent, and attribute a physical meaning to each of the associated parameters. To exploit this key advantage, this thesis explores the use of simple classic continuous-time Markov models based on a white box approach, to model not only the network traffic statistics but also the source behaviour with respect to the network and application. The thesis is divided into two parts. The first part focuses on the use of simple Markov and semi-Markov traffic models, starting from the simplest two-state model and moving up to n-state models with Poisson and non-Poisson statistics. The thesis then introduces convenient-to-use, mathematically derived Gaussian Markov models, which are used to model the measured network IP traffic statistics. As one of its most significant contributions, the thesis establishes the significance of second-order density statistics by revealing that, in contrast to first-order density statistics, they carry far more unique information on traffic sources and behaviour. The thesis then exploits Gaussian Markov models to capture these unique features and finally shows how simple classic Markov models, coupled with second-order density statistics, provide an excellent tool for capturing maximum traffic detail, which in itself is the essence of good traffic modelling. The second part of the thesis studies the ON-OFF characteristics of VoIP traffic with reference to accurate measurements of the ON and OFF periods, made from a large multi-lingual database of over 100 hours' worth of VoIP call recordings. The impact of the speaker's language, prosodic structure, and speech rate on the statistics of the ON-OFF periods is analysed and relevant conclusions are presented. Finally, an ON-OFF VoIP source model with log-normal transitions is contributed as an ideal candidate for modelling VoIP traffic, and the results of this model are compared with those of previously published work.
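
The ON-OFF source mentioned at the end of the abstract can be sketched very simply. The parameter values and function name below are illustrative assumptions rather than the thesis's fitted model; the sketch draws ON (talkspurt) and OFF (silence) durations from log-normal distributions and reports the resulting activity factor.

import numpy as np

rng = np.random.default_rng(0)

def simulate_on_off(sim_time_s=60.0, on_mu=-0.5, on_sigma=0.6,
                    off_mu=-0.8, off_sigma=0.7):
    """Return (state, duration_s) pairs covering roughly sim_time_s seconds."""
    periods, t, state = [], 0.0, "ON"
    while t < sim_time_s:
        mu, sigma = (on_mu, on_sigma) if state == "ON" else (off_mu, off_sigma)
        d = rng.lognormal(mu, sigma)        # log-normally distributed duration
        periods.append((state, d))
        t += d
        state = "OFF" if state == "ON" else "ON"
    return periods

periods = simulate_on_off()
on_total = sum(d for s, d in periods if s == "ON")
total = sum(d for _, d in periods)
print(f"Activity factor ~ {on_total / total:.2f}")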

Relevance:

100.00%

Publisher:

Abstract:

Bovine tuberculosis (bTB) caused by infection with Mycobacterium bovis is causing considerable economic loss to farmers and Government in the United Kingdom as its incidence is increasing. Efforts to control bTB in the UK are hampered by infection in Eurasian badgers (Meles meles), which represent a wildlife reservoir and a source of recurrent M. bovis exposure to cattle. Vaccination of badgers with the human TB vaccine, M. bovis bacille Calmette-Guérin (BCG), in oral bait represents a possible disease control tool and holds the best prospect for reaching badger populations over a wide geographical area. Using mouse and guinea pig models, we evaluated the immunogenicity and protective efficacy, respectively, of candidate badger oral vaccines based on formulation of BCG in lipid matrix, alginate beads, or a novel microcapsular hybrid of both lipid and alginate. Two different oral doses of BCG were evaluated in each formulation for their protective efficacy in guinea pigs, while a single dose was evaluated in mice. In mice, significant immune responses (based on lymphocyte proliferation and expression of IFN-gamma) were only seen with the lipid matrix and the lipid-in-alginate microcapsular formulations, corresponding to the isolation of viable BCG from alimentary tract lymph nodes. In guinea pigs, only BCG formulated in lipid matrix conferred protection to the spleen and lungs following aerosol route challenge with M. bovis. Protection was seen with delivery doses in the range 10⁶-10⁷ CFU, although this was more consistent in the spleen at the higher dose. No protection in terms of organ CFU was seen with BCG administered in alginate beads or in lipid-in-alginate microcapsules, although 10⁷ CFU of the latter formulation conferred protection in terms of increasing body weight after challenge and a smaller lung-to-body-weight ratio at necropsy. These results highlight the potential of lipid-based, rather than alginate-based, vaccine formulations as suitable delivery vehicles for an oral BCG vaccine in badgers.

Relevance:

100.00%

Publisher:

Abstract:

Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been tied to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes according to variations of the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to the implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts the focus away from implementation details to the whole view of the system and its runtime changes, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
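
As a rough illustration of generating a reconfiguration from a variability model, the sketch below maps sensed environment variations onto an architectural variant and emits the corresponding configuration. The feature names, variants, and selection rules are purely illustrative assumptions, not the paper's approach.

VARIANTS = {
    "low_battery": {"video_codec": "h263", "encryption": "off", "replicas": 1},
    "normal":      {"video_codec": "h264", "encryption": "on",  "replicas": 2},
    "high_load":   {"video_codec": "h264", "encryption": "on",  "replicas": 4},
}

def select_variant(battery_pct: float, requests_per_s: float) -> str:
    """Map the sensed environment onto a variant of the architecture."""
    if battery_pct < 20:
        return "low_battery"
    if requests_per_s > 100:
        return "high_load"
    return "normal"

def generate_configuration(variant: str) -> dict:
    """Produce the concrete configuration artefact for the chosen variant."""
    return dict(VARIANTS[variant])

# A drop in battery level triggers regeneration of the configuration, i.e. a
# runtime reconfiguration derived from the model rather than written by hand.
config = generate_configuration(select_variant(battery_pct=15, requests_per_s=40))
print(config)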

Relevance:

100.00%

Publisher:

Abstract:

Factors associated with the duration of dementia in a consecutive series of 103 Alzheimer's disease (AD) cases were studied using the Kaplan-Meier estimator and Cox regression analysis (proportional hazard model). Mean disease duration was 7.1 years (range: 6 weeks to 30 years, standard deviation = 5.18); 25% of cases died within four years, 50% within 6.9 years, and 75% within 10 years. Familial AD cases (FAD) had a longer duration than sporadic cases (SAD), especially cases linked to presenilin (PSEN) genes. No significant differences in duration were associated with age, sex, or apolipoprotein E (Apo E) genotype. Duration was reduced in cases with arterial hypertension. Cox regression analysis suggested that longer duration was associated with an earlier disease onset and with increased senile plaque (SP) and neurofibrillary tangle (NFT) pathology in the orbital gyrus (OrG), CA1 sector of the hippocampus, and nucleus basalis of Meynert (NBM). The data suggest shorter disease duration in SAD and in cases with hypertensive comorbidity. In addition, the overall degree of neuropathology did not influence survival, but the spread of SP/NFT pathology into the frontal lobe, hippocampus, and basal forebrain was associated with longer disease duration. © 2014 R. A. Armstrong.
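
For readers unfamiliar with the workflow named in this abstract, the sketch below fits a Kaplan-Meier estimator and a Cox proportional hazards model on synthetic data using the lifelines library; the covariate names and values are assumptions chosen for illustration and have no connection to the study's dataset.

import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(42)
n = 103
df = pd.DataFrame({
    "duration_years": rng.exponential(7.1, n),    # disease duration until death/censoring
    "event": rng.binomial(1, 0.9, n),             # 1 = death observed, 0 = censored
    "onset_age": rng.normal(75, 8, n),
    "hypertension": rng.binomial(1, 0.3, n),
    "familial": rng.binomial(1, 0.2, n),
})

# Kaplan-Meier estimate of the survival (duration) curve
kmf = KaplanMeierFitter()
kmf.fit(df["duration_years"], event_observed=df["event"])
print(kmf.median_survival_time_)

# Cox proportional hazards regression on the covariates
cph = CoxPHFitter()
cph.fit(df, duration_col="duration_years", event_col="event")
cph.print_summary()   # hazard ratio and p-value per covariate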

Relevance:

100.00%

Publisher:

Abstract:

Markers of increased oxidative stress are known to be elevated following acute myocardial infarction and in the context of chronic left ventricular hypertrophy or heart failure, and their levels may correlate with the degree of contractile dysfunction or cardiac deficit. An obvious pathological mechanism that may account for this correlation is the potentially deleterious effect of increased oxidative stress through the induction of cellular dysfunction, energetic deficit, or cell death. However, reactive oxygen species have several much more subtle effects in the remodelling or failing heart that involve specific redox-regulated modulation of signalling pathways and gene expression. Such redox-sensitive regulation appears to play an important role in the development of several components of the phenotype of the failing heart, for example cardiomyocyte hypertrophy, interstitial fibrosis, and chamber remodelling. In this article, we review the evidence supporting the involvement of reactive oxygen species and redox signalling pathways in the development of cardiac hypertrophy and heart failure, with a particular focus on the NADPH oxidase family of superoxide-generating enzymes, which appear to be especially important in redox signalling.

Relevance:

100.00%

Publisher:

Abstract:

A combination of the two-fluid and drift flux models has been used to model the transport of fibrous debris. This debris is generated during loss-of-coolant accidents in the primary circuit of pressurized or boiling water nuclear reactors, as high-pressure steam or water jets can damage adjacent insulation materials, including mineral wool blankets. Fibre agglomerates released from the mineral wools may reach the containment sump strainers, where they can accumulate and compromise the long-term operation of the emergency core cooling system. Single-effect experiments of sedimentation in a quiescent rectangular column and sedimentation in a horizontal flow are used to verify and validate this particular application of the multiphase numerical models. The utilization of both modeling approaches allows a number of pseudocontinuous dispersed phases of spherical wetted agglomerates to be modeled simultaneously. Key effects on the transport of the fibre agglomerates are particle size, density, and turbulent dispersion, as well as the relative viscosity of the fluid-fibre mixture.
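
As a back-of-the-envelope illustration of the sedimentation behaviour being validated, the sketch below estimates the terminal settling velocity of a single wetted agglomerate treated as a small sphere, using a Stokes-law approximation; the diameter and density values are assumptions for illustration, not parameters from the study.

def terminal_velocity(d_m, rho_p, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Stokes terminal settling velocity (m/s) of a small sphere of diameter d_m (m)."""
    return (rho_p - rho_f) * g * d_m**2 / (18.0 * mu)

# Wetted agglomerate ~0.1 mm across and only slightly denser than water.
v = terminal_velocity(d_m=1.0e-4, rho_p=1050.0)
print(f"Settling velocity ~ {v * 1000:.2f} mm/s")   # about 0.28 mm/s, within the Stokes regime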