935 results for Multiple Organ Failure
Abstract:
Anisotropic damage distribution and evolution have a profound effect on borehole stress concentrations. Damage evolution is an irreversible process that is not adequately described within classical equilibrium thermodynamics. Therefore, we propose a constitutive model, based on non-equilibrium thermodynamics, that accounts for anisotropic damage distribution, anisotropic damage threshold and anisotropic damage evolution. We implemented this constitutive model numerically, using the finite element method, to calculate stress–strain curves and borehole stresses. The resulting stress–strain curves are distinctively different from linear elastic-brittle and linear elastic-ideal plastic constitutive models and realistically model experimental responses of brittle rocks. We show that the onset of damage evolution leads to an inhomogeneous redistribution of material properties and stresses along the borehole wall. The classical linear elastic-brittle approach to borehole stability analysis systematically overestimates the stress concentrations on the borehole wall, because dissipative strain-softening is underestimated. The proposed damage mechanics approach explicitly models dissipative behaviour and leads to non-conservative mud window estimations. Furthermore, anisotropic rocks with preferential planes of failure, like shales, can be addressed with our model.
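The strain-softening response described above can be illustrated with a minimal one-dimensional damage sketch (illustrative only; the paper's model is anisotropic and derived from non-equilibrium thermodynamics, and the symbols and evolution law below are assumptions):

```latex
% 1D elasticity coupled to a scalar damage variable D \in [0, 1]
\sigma = (1 - D)\,E\,\varepsilon,
\qquad
D(\varepsilon) =
\begin{cases}
0, & \varepsilon \le \varepsilon_0 \quad \text{(below the damage threshold)}\\[4pt]
1 - \dfrac{\varepsilon_0}{\varepsilon}\,
      e^{-\beta(\varepsilon - \varepsilon_0)}, & \varepsilon > \varepsilon_0
\end{cases}
```

Once the strain exceeds the threshold \(\varepsilon_0\), \(D\) grows irreversibly and the tangent stiffness \(d\sigma/d\varepsilon\) turns negative, reproducing the post-peak softening that distinguishes such curves from linear elastic-brittle and elastic-ideal plastic responses; an anisotropic model makes the stiffness, the threshold and the evolution law direction-dependent.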
Abstract:
Vacuum circuit breaker (VCB) overvoltages and the resulting catastrophic failures during shunt reactor switching have been analyzed through computer simulations of multiple reignitions with a statistical VCB model found in the literature. However, a systematic review (SR) of multiple reignitions with a statistical VCB model does not yet exist. Therefore, this paper aims to analyze and explore multiple reignitions with a statistical VCB model. It examines the salient points, research gaps and limitations of the multiple reignition phenomenon to assist future investigations following the SR search. Based on the SR results, seven issues and two approaches to enhance the current statistical VCB model are identified. These results will be useful as an input to improve computer modeling accuracy as well as the development of a reignition switch model with point-on-wave controlled switching for condition monitoring.
Abstract:
BACKGROUND: Infection by dengue virus (DENV) is a major public health concern in hundreds of tropical and subtropical countries. French Polynesia (FP) regularly experiences epidemics that initiate, or are consecutive to, DENV circulation in other South Pacific Island Countries (SPICs). In January 2009, after a decade of serotype 1 (DENV-1) circulation, the first cases of DENV-4 infection were reported in FP. Two months later a new epidemic emerged, occurring about 20 years after the previous circulation of DENV-4 in FP. In this study, we investigated the epidemiological and molecular characteristics of the introduction, spread and genetic microevolution of DENV-4 in FP. METHODOLOGY/PRINCIPAL FINDINGS: Epidemiological data suggested that recent transmission of DENV-4 in FP started in the Leeward Islands and that this serotype quickly displaced DENV-1 throughout FP. Phylogenetic analyses of the nucleotide sequences of the envelope (E) gene of 64 DENV-4 strains collected in FP in the 1980s and in 2009-2010, together with additional strains from other SPICs, showed that DENV-4 strains from the SPICs were distributed into genotypes IIa and IIb. Recent FP strains were distributed into two clusters, each comprising viruses from other but distinct SPICs, suggesting that the emergence of DENV-4 in FP in 2009 resulted from multiple introductions. In addition, we observed that almost all strains collected in the SPICs in the 1980s exhibit an amino acid (aa) substitution V287I within domain I of the E protein, and all recent South Pacific strains exhibit a T365I substitution within domain III. CONCLUSIONS/SIGNIFICANCE: This study confirmed the cyclic re-emergence and displacement of DENV serotypes in FP. Moreover, our results showed that specific aa substitutions in the E protein were present in all DENV-4 strains circulating in the SPICs.
These substitutions, probably acquired and subsequently conserved through a founder effect, may be associated with the epidemiological, geographical, eco-biological and social specificities of the SPICs.
Abstract:
Mapping Multiple Literacies brings together the latest theory and research in the fields of literacy study and European philosophy, Multiple Literacies Theory (MLT) and the philosophical work of Gilles Deleuze. It frames the process of becoming literate as a fluid process involving multiple modes of presentation, and explains these processes in terms of making maps of our social lives and ways of doing things together. For Deleuze, language acquisition is a social activity of which we are a part, but only one part amongst many others. Masny and Cole draw on Deleuze's thinking to expand the repertoires of literacy research and understanding. They outline how we can understand literacy as a social activity and map the ways in which becoming literate may take hold and transform communities. The chapters in this book weave together theory, data and practice to open up a creative new area of literacy studies and to provoke vigorous debate about the sociology of literacy.
Abstract:
This thematic issue on education and the politics of becoming focuses on how a Multiple Literacies Theory (MLT) plugs into practice in education. MLT does this by creating an assemblage between discourse, text, resonance and sensations. What does this produce? Becoming AND how one might live are the product of an assemblage (May, 2005; Semetsky, 2003). In this paper, MLT is the approach that explores the connection between educational theory and practice through the lens of an empirical study of multilingual children acquiring multiple writing systems simultaneously. The introduction explicates discourse, text, resonance, sensation and becoming. The second section introduces certain Deleuzian concepts that plug into MLT. The third section serves as an introduction to MLT. The fourth section is devoted to the study by way of a rhizoanalysis. Finally, drawing on the concept of the rhizome, this article exits with potential lines of flight opened by MLT. These are becomings which highlight the significance of this work in terms of transforming not only how literacies are conceptualized, especially in minority language contexts, but also how one might live.
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which delivers centimetre-precision results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets, of up to several metres, without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Missed detection of incorrect integers leads to hazardous results and should be strictly controlled; in ambiguity resolution, the missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, a criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking it up in the table. The method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis test theory, the fixed failure rate approach is introduced, and, in particular, the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the ratio test threshold in the fixed failure rate approach are discussed based on extensive data simulations. The results show that, with a proper stochastic model, the fixed failure rate approach is a more reasonable ambiguity validation method.
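The ratio test referred to above can be sketched as follows (a minimal illustration; the variable names and the example threshold are assumptions, not values from the paper):

```python
def ratio_test(q1, q2, c):
    """Accept the best integer candidate if the second-best squared
    residual q2 exceeds the best squared residual q1 by the factor c.

    q1: squared norm of (float - best integer) candidate
    q2: squared norm of (float - second-best integer) candidate
    c:  threshold; fixed empirically (e.g. 2 or 3) in the classical
        test, or looked up from a failure-rate-indexed criteria table
        in the fixed failure rate approach described above.
    """
    return q2 / q1 >= c

# Classical empirical threshold: the same c regardless of model strength.
print(ratio_test(q1=0.8, q2=2.0, c=2.0))  # True  -> fix the ambiguities
print(ratio_test(q1=0.8, q2=1.2, c=2.0))  # False -> fall back to the float solution
```

The fixed failure rate approach keeps this decision rule but replaces the constant `c` with a model-dependent value chosen so that the probability of accepting a wrong integer stays at the prescribed failure rate.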
Abstract:
Global Navigation Satellite Systems (GNSS)-based observation systems can provide high-precision positioning and navigation solutions in real time, of the order of a sub-centimetre if carrier phase measurements are used in the differential mode and all the bias and noise terms are handled well. However, these carrier phase measurements are ambiguous by unknown integer numbers of cycles, and one key challenge in the differential carrier phase mode is to fix these integer ambiguities correctly. Moreover, in safety-of-life or liability-critical applications, such as vehicle safety positioning and aviation, not only high accuracy but also high reliability is required. This PhD research studies how to achieve high reliability for ambiguity resolution (AR) in a multi-GNSS environment. GNSS ambiguity estimation and validation problems are the focus of the research effort. In particular, we study the case of multiple constellations, covering the initial to full operations of the foreseeable Galileo, GLONASS, Compass and QZSS navigation systems from the next few years to the end of the decade. Since real observation data are only available from the GPS and GLONASS systems, a simulation method named Virtual Galileo Constellation (VGC) is applied to generate observational data for another constellation in the data analysis. In addition, both full ambiguity resolution (FAR) and partial ambiguity resolution (PAR) algorithms are used in processing single- and dual-constellation data. Firstly, a brief overview of related work on AR methods and reliability theory is given. Next, a modified inverse integer Cholesky decorrelation method and its performance on AR are presented. Subsequently, a new measure of decorrelation performance called the orthogonality defect is introduced and compared with other measures. Furthermore, a new AR scheme that considers the ambiguity validation requirement in the control of the search space size is proposed to improve the search efficiency. With respect to the reliability of AR, we also discuss the computation of the ambiguity success rate (ASR) and confirm that the success rate computed with the integer bootstrapping method is quite a sharp approximation to that of the actual integer least-squares (ILS) method. The advantages of multi-GNSS constellations are examined in terms of the PAR technique with a predefined ASR. Finally, a novel satellite selection algorithm for reliable ambiguity resolution, called SARA, is developed. In summary, the study demonstrates that when the ASR is close to one, the reliability of AR can be guaranteed and the ambiguity validation is effective. The work then focuses on new strategies to improve the ASR, including a partial ambiguity resolution procedure with a predefined success rate and a novel satellite selection strategy with a high success rate. The proposed strategies bring the significant benefits of multi-GNSS signals to real-time high-precision, high-reliability positioning services.
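The integer bootstrapping success rate mentioned above has a closed form in terms of the conditional standard deviations of the (decorrelated) ambiguities; a small sketch with illustrative input values:

```python
from math import erf, sqrt

def phi(x):
    # Standard normal cumulative distribution function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bootstrap_success_rate(cond_stds):
    """Integer bootstrapping success rate from the conditional standard
    deviations sigma_{i|I} of the decorrelated ambiguities:

        P = prod_i ( 2 * Phi(1 / (2 * sigma_{i|I})) - 1 )

    It lower-bounds the integer least-squares success rate and, after
    decorrelation, is a sharp approximation to it."""
    p = 1.0
    for s in cond_stds:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

# Smaller conditional std devs (a stronger model, e.g. multi-GNSS)
# push the success rate towards one.
print(bootstrap_success_rate([0.05, 0.08, 0.10]))
print(bootstrap_success_rate([0.30, 0.40, 0.50]))
```

This is the quantity a PAR procedure monitors: ambiguities are added to the fixed subset only while the product stays above the predefined success rate.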
Abstract:
A procedure for the evaluation of multiple scattering contributions is described, for deep inelastic neutron scattering (DINS) studies using an inverse geometry time-of-flight spectrometer. The accuracy of a Monte Carlo code DINSMS, used to calculate the multiple scattering, is tested by comparison with analytic expressions and with experimental data collected from polythene, polycrystalline graphite and tin samples. It is shown that the Monte Carlo code gives an accurate representation of the measured data and can therefore be used to reliably correct DINS data.
Abstract:
Welcome to the Quality assessment matrix. This matrix is designed for highly qualified discipline experts to evaluate their course, major or unit in a systematic manner. The primary purpose of the Quality assessment matrix is to provide a tool with which a group of academic staff at universities can collaboratively review the assessment within a course, major or unit annually. The annual review will result in you being ready for an external curriculum review at any point in time. This tool is designed for use in a workshop format with one, two or more academic staff, and will lead to an action plan for implementation.
Abstract:
Background: Heart failure is a serious condition estimated to affect 1.5-2.0% of the Australian population, with a point prevalence of approximately 1% in people aged 50-59 years, 10% in people aged 65 years or more and over 50% in people aged 85 years or over (National Heart Foundation of Australia and the Cardiac Society of Australia and New Zealand, 2006). Sleep disturbances are a common complaint of persons with heart failure. Disturbances of sleep can worsen heart failure symptoms, impair independence, reduce quality of life and lead to increased health care utilisation in patients with heart failure. Previous studies have identified exercise as a possible treatment for poor sleep in patients without cardiac disease; however, there is limited evidence of the effect of this form of treatment in heart failure. Aim: The primary objective of this study was to examine the effect of a supervised, hospital-based exercise training programme on subjective sleep quality in heart failure patients. Secondary objectives were to examine the association between changes in sleep quality and changes in depression, exercise performance and body mass index. Methods: The sample for the study was recruited from metropolitan and regional heart failure services across Brisbane, Queensland. Patients with a recent heart failure related hospital admission who met study inclusion criteria were recruited. Participants were screened by specialist heart failure exercise staff at each site to ensure exercise safety prior to study enrolment. Demographic data, medical history, medications, Pittsburgh Sleep Quality Index score, Geriatric Depression Score, exercise performance (six minute walk test), weight and height were collected at baseline. Pittsburgh Sleep Quality Index score, Geriatric Depression Score, exercise performance and weight were repeated at 3 months.
One hundred and six patients admitted to hospital with heart failure were randomly allocated to a 3-month disease-based management programme of education and self-management support including standard exercise advice (Control) or to the same disease management programme as the Control group with the addition of a tailored physical activity programme (Intervention). The intervention consisted of 1 hour of aerobic and resistance exercise twice a week. Programmes were designed and supervised by an exercise specialist. The main outcome measure was achievement of a clinically significant change (≥3 points) in global Pittsburgh Sleep Quality score. Results: Intervention group participants reported significantly greater clinical improvement in global sleep quality than Control (p=0.016). These patients also exhibited significant improvements in component sleep disturbance (p=0.004), component sleep quality (p=0.015) and global sleep quality (p=0.032) after 3 months of supervised exercise intervention. Improvements in sleep quality correlated with improvements in depression (p<0.001) and six minute walk distance (p=0.04). When study results were examined categorically, with subjects classified as either "poor" or "good" sleepers, subjects in the Control group were significantly more likely to report "poor" sleep at 3 months (p=0.039), while Intervention participants tended to report "good" sleep at this time (p=0.08). Conclusion: Three months of supervised, hospital-based, aerobic and resistance exercise training improved subjective sleep quality in patients with heart failure. This is the first randomised controlled trial to examine the role of aerobic and resistance exercise training in the improvement of sleep quality for patients with this disease. While this study establishes exercise as a therapy for poor sleep quality, further research is needed to investigate the effect of exercise training on objective parameters of sleep in this population.
Abstract:
With a monolayer honeycomb lattice of sp2-hybridized carbon atoms, graphene has demonstrated exceptional electrical, mechanical and thermal properties. One of its promising applications is to create graphene-polymer nanocomposites with tailored mechanical and physical properties. In general, the mechanical properties of the graphene nanofiller as well as the graphene-polymer interface govern the overall mechanical performance of graphene-polymer nanocomposites. However, the strengthening and toughening mechanisms in these novel nanocomposites have not been well understood. In this work, the deformation and failure of graphene sheets and the graphene-polymer interface were investigated using molecular dynamics (MD) simulations. The effect of structural defects on the mechanical properties of graphene and the graphene-polymer interface was investigated as well. The results showed that structural defects in graphene (e.g. Stone-Wales defects and multi-vacancy defects) can significantly deteriorate the fracture strength of graphene, yet the composite may still make full use of the remaining strength of the defective graphene, retaining the interfacial strength and the overall mechanical performance of graphene-polymer nanocomposites.
Abstract:
Remote monitoring for heart failure has been evaluated in numerous systematic reviews. The aim of this meta-review was to appraise their quality and synthesise results. We electronically searched online databases, performed a forward citation search and hand-searched bibliographies. Systematic reviews of remote monitoring interventions that were used for surveillance of heart failure patients were included. Seven (41%) systematic reviews pooled results for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Five (29%) focused on telemonitoring. Four (24%) included both non-invasive and invasive technologies. According to AMSTAR criteria, ten (58%) systematic reviews were of poor methodological quality. In high quality reviews, the relative risk of mortality in patients who received remote monitoring ranged from 0.53 (95% CI=0.29-0.96) to 0.88 (95% CI=0.76-1.01). High quality reviews also reported that remote monitoring reduced the relative risk of all-cause (0.52; 95% CI=0.28-0.96 to 0.96; 95% CI=0.90-1.03) and heart failure-related hospitalizations (0.72; 95% CI=0.64-0.81 to 0.79; 95% CI=0.67-0.94) and, as a consequence, healthcare costs. As the high quality reviews reported that remote monitoring reduced hospitalizations, mortality and healthcare costs, research efforts should now be directed towards optimising these interventions in preparation for more widespread implementation.
Abstract:
Background/aims: Remote monitoring for heart failure has not only been evaluated in a large number of randomised controlled trials, but also in many systematic reviews and meta-analyses. The aim of this meta-review was to identify, appraise and synthesise existing systematic reviews that have evaluated the effects of remote monitoring in heart failure. Methods: Using a Cochrane methodology, we electronically searched all relevant online databases and search engines, performed a forward citation search as well as hand-searched bibliographies. Only fully published systematic reviews of invasive and/or non-invasive remote monitoring interventions were included. Two reviewers independently extracted data. Results: Sixty-five publications from 3333 citations were identified. Seventeen fulfilled the inclusion and exclusion criteria. Quality varied, with A Measurement Tool to Assess Systematic Reviews (AMSTAR) scores ranging from 2 to 11 (mean 5.88). Seven reviews (41%) pooled results from individual studies for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Four (24%) focused specifically on telemonitoring. Four (24%) included studies investigating both non-invasive and invasive technologies. Population characteristics of the included studies were not reported consistently. Mortality and hospitalisations were the most frequently reported outcomes, appearing in 12 reviews (70%). Only five reviews (29%) reported healthcare costs and compliance. A high degree of heterogeneity was reported in many of the meta-analyses. Conclusions: These results should be considered in the context of two negative RCTs of remote monitoring for heart failure that have been published since the meta-analyses (TIM-HF and Tele-HF). However, high quality reviews demonstrated improved mortality and quality of life, and reductions in hospitalisations and healthcare costs.
Abstract:
The emergence of pseudo-marginal algorithms has led to improved computational efficiency for dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables, so pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely. This can be difficult to achieve in complex applications. In this paper we propose to take advantage of the multiple central processing units (CPUs) that are readily available on most standard desktop computers. Here the likelihood is estimated independently on the multiple CPUs, with the ultimate estimate of the likelihood being the average of the estimates obtained from the multiple CPUs. The estimate remains unbiased, but the variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
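The averaging idea can be sketched with Python's standard multiprocessing module (one of several technologies that fit the description above; the toy likelihood estimator here is an assumption purely for illustration, standing in for e.g. a particle filter):

```python
import numpy as np
from multiprocessing import Pool

def estimate_likelihood(seed, theta=1.0, n_particles=200):
    """Toy unbiased likelihood estimator: an average of positive
    importance weights whose expectation is the true likelihood
    (here constructed so that the true value is exactly 1)."""
    rng = np.random.default_rng(seed)
    weights = np.exp(theta * rng.standard_normal(n_particles) - 0.5 * theta**2)
    return weights.mean()

if __name__ == "__main__":
    n_cpus = 4
    # One independent likelihood estimate per CPU, run in parallel.
    with Pool(n_cpus) as pool:
        estimates = pool.map(estimate_likelihood, range(n_cpus))
    # The average of independent unbiased estimates is still unbiased,
    # but its variance is reduced by a factor of n_cpus.
    pooled = sum(estimates) / n_cpus
```

Inside a pseudo-marginal MCMC sampler, `pooled` would replace the single-CPU likelihood estimate in the acceptance ratio; the reduced variance is what improves the chain's mixing.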
Abstract:
Reliability analysis is crucial to reducing the unexpected downtime, severe failures and ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest, as the hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution accurately describes the population concerned, and that the effects of covariates on the hazard take an assumed form. These two assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations arising from the two assumptions of statistical models. With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, owing to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research therefore investigates the incomplete covariates problem in reliability analysis. Typical approaches to handling incomplete covariates have been studied to investigate their performance and their effects on reliability analysis results. Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainty into reliability analysis, the developed NNHMs are extended to handle incomplete covariates as an integral part of the model. The extended NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump; the results demonstrate that the new approach outperforms the typical incomplete-covariates handling approaches. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation under the influence of both engineering degradation and changes in environmental settings, so the commonly used covariate extrapolation methods are unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, this research projects covariate states. The estimated covariate states and the unknown covariate values in future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure is able to generate more accurate analysis results.
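The two statistical assumptions criticised above can be made concrete with the widely used proportional hazards form, in which a parametric baseline and a log-linear covariate effect are both imposed a priori (a generic sketch with illustrative parameter values, not the thesis's NNHM):

```python
import numpy as np

def weibull_ph_hazard(t, x, beta, shape=1.5, scale=1000.0):
    """Proportional hazards model h(t|x) = h0(t) * exp(beta . x).

    Assumption 1: a Weibull baseline h0(t) = (k/lam) * (t/lam)**(k-1)
    is taken to describe the population's failure behaviour.
    Assumption 2: covariates act multiplicatively through exp(beta . x).
    A neural network hazard model replaces both with a learned,
    non-linear mapping from (t, x) to the hazard."""
    h0 = (shape / scale) * (t / scale) ** (shape - 1.0)
    return h0 * np.exp(np.dot(beta, x))

# Hazard of an asset at t = 500 operating hours given two condition
# covariates (e.g. normalised vibration and temperature readings).
h = weibull_ph_hazard(t=500.0, x=np.array([0.2, 1.0]),
                      beta=np.array([0.5, 0.3]))
```

If either assumption is wrong for the asset population at hand, every downstream reliability estimate inherits the error, which is the motivation for the non-linear NNHM approach.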