935 results for Multiple Organ Failure
Abstract:
Ambiguity validation, an important step in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in the positioning computation. Most existing investigations of ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous basis in probability theory, but it employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve performance comparable to, or even better than, the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method for the FF-difference test, named the threshold function, is proposed. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
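As a minimal sketch of how the two acceptance rules differ (names and toy values are illustrative assumptions, not the paper's notation): q1 and q2 stand for the ILS residual quadratic forms of the best and second-best integer candidates, and the thresholds would come from the fixed failure rate procedure or the proposed threshold function.

```python
def ambiguity_tests(q1, q2, ratio_threshold, diff_threshold):
    """Decide whether to accept the fixed integer ambiguities.

    q1 -- ILS residual quadratic form of the best integer candidate
    q2 -- ILS residual quadratic form of the second-best candidate
    """
    # FF-ratio test: accept when the best candidate beats the
    # runner-up by a sufficient relative margin.
    ratio_ok = (q1 / q2) <= ratio_threshold

    # FF-difference test: accept based on the absolute separation.
    # Per the paper, diff_threshold can be read from a threshold
    # function of the ILS success rate for a chosen failure rate
    # tolerance.
    diff_ok = (q2 - q1) >= diff_threshold

    return ratio_ok, diff_ok

# Toy values: the best candidate fits much better than the runner-up.
print(ambiguity_tests(q1=1.2, q2=4.8, ratio_threshold=0.5,
                      diff_threshold=2.0))  # (True, True)
```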
Abstract:
This paper explores the concept that individual dancers leave traces in a choreographer's body of work and, similarly, that dancers carry forward residue of embodied choreographies into other working processes. The presentation is grounded in a study of the multiple iterations of a programme of solo works commissioned in 2008 from the choreographers John Jasperse, Jodi Melnick, Liz Roche and Rosemary Butcher and danced by the author. This includes an exploration of John Jasperse's development of themes from his solo into the pieces PURE (2008) and Truth, Revised Histories, Wishful Thinking and Flat Out Lies (2009); an adaptation of the solo Business of the Bloom by Jodi Melnick in 2008; and a further adaptation of Business of the Bloom by this author in 2012. It will map some of the developments that occurred across a number of further performances, over five years, of the solo Shared Material on Dying by Liz Roche, and the working process of the (uncompleted) solo Episodes of Flight by Rosemary Butcher. The purpose is to reflect back on authorship in dance, an art form in which lineages of influence can often be clearly observed. Normally, once a choreographic work is created and performed, it is archived through video recording, notation and/or reviews. The dancer is no longer called upon to represent the dance piece within the archive, and thus her/his lived presence and experiential perspective disappears. The author will draw on the different traces still inhabiting her body as pathways towards understanding how choreographic movement circulates beyond this moment of performance. This will include an interrogation of the ownership of choreographic movement: once it becomes integrated into the body of the dancer, who owns the dance? Furthermore, certain dancers, through their individual physical characteristics and moving identities, can deeply influence the formation of choreographic signatures, a proposition that challenges the sole authorship role of the choreographer in dance production. This paper will be delivered in a presentation format that will bleed into movement demonstrations alongside video footage of the works and auto-ethnographic accounts of dancing experience. A further source of knowledge will be drawn from extracts of interviews with other dancers, including Sara Rudner, Rebecca Hilton and Catherine Bennett.
Abstract:
Strain-based failure criteria have several advantages over stress-based failure criteria: they can account for elastic and inelastic strains, they utilise direct, observable effects instead of inferred effects (strain gauges vs. stress estimates), and they can model complete stress-strain curves, including pre-peak non-linear elasticity and post-peak strain weakening. In this study, a strain-based failure criterion derived from thermodynamic first principles, utilising the concepts of continuum damage mechanics, is presented. Furthermore, the implementation of this failure criterion in a finite-element simulation is demonstrated and applied to the stability of underground mining coal pillars. In numerical studies, pillar strength is usually expressed in terms of critical stresses or stress-based failure criteria, where scaling with pillar width and height is common. Previous publications have employed the finite-element method for pillar stability analysis using stress-based failure criteria such as Mohr-Coulomb and Hoek-Brown, or stress-based scalar damage models. A novel constitutive material model, which takes into consideration anisotropy as well as elastic strain and damage as state variables, has been developed and is presented in this paper. The damage threshold and its evolution are strain-controlled, and coupling of the state variables is achieved through the damage-induced degradation of the elasticity tensor. This material model is implemented in the finite-element software ABAQUS and can be applied to 3D problems. Initial results show that this new material model is capable of describing the non-linear behaviour of geomaterials commonly observed before peak strength is reached, as well as post-peak strain softening. Furthermore, it is demonstrated that the model can account for directional dependency of failure behaviour (i.e. anisotropy) and has the potential to be expanded to environmental controls such as temperature or moisture.
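The paper's constitutive model is anisotropic with tensor-valued state variables; as a much-simplified illustration of the general mechanism it describes (a strain-controlled damage threshold coupled to stiffness degradation), here is a one-dimensional isotropic sketch. The exponential evolution law and all parameter values are illustrative assumptions, not the authors' formulation.

```python
import math

def damage_update(strain, kappa, E0=30e3, eps0=1e-4, alpha=0.99, beta=2e3):
    """One-dimensional strain-driven damage update (illustrative only).

    strain -- current total strain
    kappa  -- history variable: largest strain seen so far
    E0     -- undamaged Young's modulus
    eps0   -- strain threshold at which damage initiates
    """
    # Strain-controlled threshold: the history variable only grows.
    kappa = max(kappa, strain)

    # Exponential damage evolution (an illustrative choice, not the
    # authors' law); d = 0 intact, d -> alpha fully degraded.
    d = 0.0 if kappa <= eps0 else alpha * (1.0 - math.exp(-beta * (kappa - eps0)))

    # Coupling: damage degrades the stiffness, which produces pre-peak
    # non-linearity and post-peak softening in the stress response.
    stress = (1.0 - d) * E0 * strain
    return stress, d, kappa

# Driving the model through increasing strain shows peak and softening:
kappa = 0.0
for eps in (0.5e-4, 1.0e-4, 5e-4, 2e-3, 5e-3):
    stress, d, kappa = damage_update(eps, kappa)
    print(f"strain={eps:.1e}  stress={stress:.2f}  damage={d:.3f}")
```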
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing the risks that may arise during process execution. Risk reduction involves decreasing both the likelihood that a process fault occurs and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process, and whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between the risks of different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system, and its effectiveness has been evaluated using a real-life scenario in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that concurrently executed process instances complete with significantly fewer faults and with lower fault severities when the recommendations provided by our system are taken into account.
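To illustrate the resource-assignment step, the sketch below solves a toy one-to-one assignment minimising total predicted risk. The paper formulates this as an integer linear program; for a square one-to-one assignment the Hungarian algorithm (scipy's linear_sum_assignment) returns the same optimum. The risk matrix is hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical predicted risk of assigning resource r to task t,
# e.g. read from decision trees mined from past execution logs.
risk = np.array([
    [0.20, 0.45, 0.10],   # resource 0
    [0.35, 0.15, 0.50],   # resource 1
    [0.25, 0.30, 0.05],   # resource 2
])

# One-to-one assignment minimising total predicted risk; for this
# structure the ILP optimum coincides with the Hungarian solution.
rows, cols = linear_sum_assignment(risk)
for r, t in zip(rows, cols):
    print(f"resource {r} -> task {t} (risk {risk[r, t]:.2f})")
print("total risk:", risk[rows, cols].sum())
```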
Abstract:
Welcome to the Evaluation of course matrix. This matrix is designed for highly qualified discipline experts to evaluate their course, major or unit in a systematic manner. The primary purpose of the Evaluation of course matrix is to provide a tool with which a group of academic staff at universities can collaboratively review the assessment within a course, major or unit annually. The annual review will leave you ready for an external curriculum review at any point in time. This tool is designed for use in a workshop format with one, two or more academic staff, and will lead to an action plan for implementation. I hope you find this tool useful in your assessment review.
Abstract:
This pilot study aimed to evaluate the feasibility of a web-based self-management intervention for patients with heart failure. The study consisted of two phases: developing the web-based application and examining its feasibility in a group of heart failure patients. The results of this study were consistent with the current literature, which has failed to show the benefits of web-based interventions for chronic disease self-management. In the current thesis, therefore, issues influencing the effectiveness of web-based interventions were analysed. Recommendations for improving the effectiveness of web-based applications were also provided.
Abstract:
The rheological properties of the F-actin cytoskeleton are significant to cytoskeletal restructuring during a variety of cell activities. This study numerically validates that the rheological behaviour of the F-actin cytoskeleton is not only a result of kinetic energy dissipation in F-actin but also depends greatly on the configurational remodeling of the network structure. Both filament geometry and crosslinker properties can affect the remodeling of the F-actin cytoskeleton. Crosslinker unbinding is found to dissipate energy and induce prominent stress relaxation in the F-actin adjacent to cross-linkages. Coupled with F-actin elasticity, the energy dissipation and stress relaxation are more significant in bundled F-actin networks than in single F-actin networks.
Abstract:
Alignment-free methods, in which shared properties of sub-sequences (e.g. identity or match length) are extracted and used to compute a distance matrix, have recently been explored for phylogenetic inference. However, the scalability and robustness of these methods to key evolutionary processes remain to be investigated. Here, using simulated sequence sets of various sizes in both nucleotides and amino acids, we systematically assess the accuracy of phylogenetic inference using an alignment-free approach, based on D2 statistics, under different evolutionary scenarios. We find that, compared to a multiple sequence alignment approach, D2 methods are more robust against among-site rate heterogeneity, compositional biases, genetic rearrangements and insertions/deletions, but are more sensitive to recent sequence divergence and sequence truncation. Across diverse empirical datasets, the alignment-free methods perform well for sequences sharing low divergence, at greater computational speed. Our findings provide strong evidence for the scalability and the potential use of alignment-free methods in large-scale phylogenomics.
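In its basic form, the D2 statistic is the inner product of the k-mer count vectors of two sequences (variants such as D2* and D2S standardise these counts first). A minimal sketch with toy sequences; the exact variant and k used by the paper are not specified here.

```python
from collections import Counter

def d2(seq_a, seq_b, k=5):
    """Basic (unnormalised) D2 statistic: the inner product of the
    k-mer count vectors of two sequences."""
    counts_a = Counter(seq_a[i:i + k] for i in range(len(seq_a) - k + 1))
    counts_b = Counter(seq_b[i:i + k] for i in range(len(seq_b) - k + 1))
    # Missing k-mers count as zero, so only shared k-mers contribute.
    return sum(n * counts_b[kmer] for kmer, n in counts_a.items())

# Toy sequences; pairwise D2 values would feed a distance matrix
# for tree inference.
print(d2("ACGTACGTACGT", "ACGTTCGTACGT", k=4))
```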
Abstract:
The autonomous capabilities of collaborative unmanned aircraft systems are growing rapidly. Without appropriate transparency, the effectiveness of the future multiple Unmanned Aerial Vehicle (UAV) management paradigm will be significantly limited by the human agent's cognitive abilities, as the operator's Cognitive Workload (CW) and Situation Awareness (SA) become disproportionate. This poses the challenge of evaluating the impact of robot autonomous-capability feedback, which gives the human agent, in a supervisory role, greater transparency into the robot's autonomous status. This paper presents the motivation, aim, related works, experiment theory, methodology, results and discussion, and the future work succeeding this preliminary study. The results in this paper illustrate that, with greater transparency of a UAV's autonomous capability, an overall improvement in the subjects' cognitive abilities was evident: at the 95% confidence level, the test subjects' mean CW showed a statistically significant reduction, while their mean SA showed a statistically significant increase.
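As an illustration of the kind of significance test implied (the abstract does not specify the exact procedure, so this is an assumption), a paired t-test at the 95% confidence level on hypothetical workload scores might look like:

```python
import numpy as np
from scipy import stats

# Hypothetical paired workload ratings for the same subjects,
# without and with autonomous-capability transparency feedback.
cw_baseline = np.array([62, 70, 55, 68, 74, 60, 66, 71])
cw_transparent = np.array([55, 61, 50, 60, 65, 54, 59, 63])

# Paired t-test at alpha = 0.05; a small p-value indicates a
# statistically significant reduction in mean cognitive workload.
t_stat, p_value = stats.ttest_rel(cw_baseline, cw_transparent)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, significant: {p_value < 0.05}")
```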
Abstract:
The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management actions in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision-making tools can choose actions to favor such learning in two ways: implicitly, via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly, by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives: a pure management objective, a pure learning objective, and an objective that is a weighted mixture of the two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision-making tools for conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision-making tools can be improved. © 2010 Elsevier Ltd.
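A weighted mixture objective of this kind can be written as a simple convex combination; the sketch below is illustrative only (the names, normalisation and candidate actions are assumptions, not the paper's formulation).

```python
def project_objective(management_benefit, information_gain, w):
    """Weighted mixture of a pure management objective (w = 1) and a
    pure learning objective (w = 0). Both terms are assumed to be
    normalised to [0, 1]."""
    return w * management_benefit + (1.0 - w) * information_gain

# Scoring two hypothetical candidate actions under an even weighting:
actions = {"monitor_then_act": (0.6, 0.9), "act_immediately": (0.8, 0.2)}
for name, (benefit, learning) in actions.items():
    print(name, project_objective(benefit, learning, w=0.5))
```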
Abstract:
Player experiences and expectations are connected. The presumptions players have about how they control their gameplay interactions may shape the way they play and perceive videogames. A successfully engaging player experience might rest on the way controllers meet players' expectations. We studied player interaction with novel controllers on the Sony PlayStation Wonderbook, an augmented reality (AR) gaming system. Our goal was to understand player expectations regarding game controllers in AR game design. Based on this preliminary study, we propose several interaction guidelines for hybrid input from both augmented reality and physical game controllers.
Abstract:
Australia's governance of land and natural resources involves multiple polycentric domains of decision-making, from global through to local levels. Although certainly complex, these arrangements have not necessarily translated into better decision-making or better environmental outcomes, as evidenced by the growing concerns over the health and future of the Great Barrier Reef (GBR). However, within this system, the arrangements for natural resource management (NRM) and reef water quality, which both use Australia's integrated regional NRM model, have shown signs of improving decision-making and environmental outcomes in the GBR. In this paper we describe the latest evolutions in the governance and planning of natural resource use and management in Australia. We begin by reviewing the experience with first generation NRM as published in major audits and evaluations. As our primary interest is the health and future of the GBR, we then consider the impact of changes in second generation planning and governance outcomes in Queensland. We find that first generation plans, although developed under a relatively cohesive governance context, faced substantial problems in target setting, implementation, monitoring and review. Despite this, they were able to progress improvements in water quality in the Great Barrier Reef regions. Second generation plans, currently being developed, face an even greater risk of failure due to the lack of bilateralism and cross-sectoral cooperation across the NRM governance system. The findings highlight the critical need to re-build and enhance the regional NRM model for NRM planning to have a positive impact on environmental outcomes in the GBR.