918 results for probabilistic graphical model
Abstract:
Healthcare-associated methicillin-resistant Staphylococcus aureus (MRSA) infection may cause increased hospital stay or, sometimes, death. Quantifying this effect is complicated because infection is a time-dependent exposure: infection may prolong hospital stay, while longer stays increase the risk of infection. We overcome these problems by using a multinomial longitudinal model to estimate the daily probability of death and discharge. We then extend the basic model to estimate how the effect of MRSA infection varies over time, and to quantify the number of excess ICU days due to infection. We find that infection decreases the relative risk of discharge (relative risk ratio = 0.68, 95% credible interval: 0.54, 0.82), but is only indirectly associated with increased mortality. An infection on the first day of admission resulted in a mean extra stay of 0.3 days (95% CI: 0.1, 0.5) for a patient with an APACHE II score of 10, and 1.2 days (95% CI: 0.5, 2.0) for a patient with an APACHE II score of 30. The decrease in the relative risk of discharge remained fairly constant with day of MRSA infection, but was slightly stronger closer to the start of infection. These results confirm the importance of MRSA infection in increasing ICU stay, but suggest that previous work may have systematically overestimated the effect size.
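A minimal sketch of the kind of daily multinomial (competing-risks) model this abstract describes, where each ICU day resolves into stay, discharge, or death via a softmax over covariates. All covariate choices and coefficient values here are hypothetical illustrations, not estimates from the study:

```python
import numpy as np

def daily_outcome_probs(apache_ii, mrsa_infected, coef):
    """Softmax over daily outcomes {stay, discharge, death} given covariates.

    `coef` maps each non-reference outcome to (intercept, b_apache, b_mrsa);
    'stay' is the reference category with linear predictor 0.
    """
    x = np.array([1.0, apache_ii, mrsa_infected])
    logits = np.array([0.0,                    # reference: remain in ICU
                       coef["discharge"] @ x,
                       coef["death"] @ x])
    p = np.exp(logits - logits.max())          # numerically stable softmax
    return dict(zip(["stay", "discharge", "death"], p / p.sum()))

# Hypothetical coefficients for illustration only; note exp(-0.39) ~ 0.68,
# echoing the relative risk ratio of discharge reported in the abstract.
coef = {"discharge": np.array([-1.0, -0.05, -0.39]),
        "death":     np.array([-4.0,  0.10,  0.0])}
probs = daily_outcome_probs(apache_ii=10, mrsa_infected=1, coef=coef)
```

Summing these daily probabilities over simulated trajectories is one way such a model can yield expected excess length of stay.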
Abstract:
In the era of climate change, sustainable urban development, and in particular the provision of sustainable urban infrastructure, has become a key concept in dealing with environmental challenges. This paper discusses issues affecting stormwater quality and introduces a new indexing model to be used in the evaluation of stormwater quality in urban areas. The model has recently been developed and will be tested in a number of pilot projects in the Gold Coast, one of the fastest growing and most environmentally challenged cities in Australia.
Abstract:
A bioactive and bioresorbable scaffold fabricated from medical-grade poly(epsilon-caprolactone) and incorporating 20% beta-tricalcium phosphate (mPCL–TCP) was recently developed for bone regeneration at load-bearing sites. In the present study, we aimed to evaluate bone ingrowth into mPCL–TCP in a large animal model of lumbar interbody fusion. Six pigs underwent a 2-level (L3/4; L5/6) anterior lumbar interbody fusion (ALIF) implanted with mPCL–TCP + 0.6 mg rhBMP-2 as the treatment group, while four other pigs implanted with autogenous bone graft served as the control. Computed tomographic scanning and histology revealed complete defect bridging in all (100%) specimens from the treatment group as early as 3 months. Histological evidence of continuing bone remodeling and maturation was observed at 6 months. In the control group, only partial bridging was observed at 3 months, and only 50% of segments in this group showed complete defect bridging at 6 months. Furthermore, 25% of segments in the control group showed evidence of graft fracture, resorption and pseudoarthrosis. In contrast, no evidence of graft fracture, pseudoarthrosis or foreign body reaction was observed in the treatment group. These results reveal that mPCL–TCP scaffolds could act as bone graft substitutes by providing a suitable environment for bone regeneration in a dynamic load-bearing setting such as a porcine model of interbody spine fusion.
Abstract:
Currently, well-established clinical therapeutic approaches for bone reconstruction are restricted to the transplantation of autografts and allografts, and the implantation of metal devices or ceramic-based implants to assist bone regeneration. Bone grafts possess osteoconductive and osteoinductive properties; however, they are limited in access and availability and associated with donor site morbidity, haemorrhage, risk of infection, insufficient transplant integration, graft devitalisation, and subsequent resorption resulting in decreased mechanical stability. As a result, recent research focuses on the development of alternative therapeutic concepts. The field of tissue engineering has emerged as an important approach to bone regeneration. However, bench-to-bedside translations are still infrequent, as the process towards approval by regulatory bodies is protracted and costly, requiring both comprehensive in vitro and in vivo studies. The subsequent gap between research and clinical translation, hence commercialization, is referred to as the ‘Valley of Death’ and describes the large number of projects and/or ventures that are abandoned due to a lack of funding during the transition from product/technology development to regulatory approval and subsequent commercialization. One of the greatest difficulties in bridging the Valley of Death is to develop good manufacturing processes (GMP) and scalable designs and to apply these in pre-clinical studies. In this article, we describe part of the rationale and road map of how our multidisciplinary research team has approached the first steps of translating orthopaedic bone engineering from bench to bedside by establishing a pre-clinical ovine critical-sized tibial segmental bone defect model, and we discuss our preliminary data relating to this decisive step.
Abstract:
Queensland University of Technology (QUT) is faced with a rapidly growing research agenda built upon a strategic research capacity-building program. This presentation will outline the results of a project that has recently investigated QUT’s research support requirements and which has developed a model for the support of eResearch across the university. QUT’s research building strategy has produced growth at the faculty level and within its research institutes. This increased research activity is pushing the need for university-wide eResearch platforms capable of providing infrastructure and support in areas such as collaboration, data, networking, authentication and authorisation, workflows and the grid. One of the driving forces behind the investigation is the data-centric nature of modern research. It is now critical that researchers have access to supported infrastructure that allows the collection, analysis, aggregation and sharing of large data volumes for exploration and mining in order to gain new insights and to generate new knowledge. However, recent surveys into current research data management practices by the Australian Partnership for Sustainable Repositories (APSR) and by QUT itself have revealed serious shortcomings in areas such as research data management, especially its long-term maintenance for reuse and as authoritative evidence of research findings. While these internal university pressures are building, external pressures are magnifying them at the same time. For example, recent compliance guidelines from bodies such as the ARC, NHMRC and Universities Australia indicate that institutions need to provide facilities for the safe and secure storage of research data, along with a surrounding set of policies on its retention, ownership and accessibility.
The newly formed Australian National Data Service (ANDS) is developing strategies and guidelines for research data management, and research institutions are a central focus, responsible for managing and storing institutional data on platforms that can be federated nationally and internationally for wider use. For some time QUT has recognised the importance of eResearch and has been active in a number of related areas: ePrints to digitally publish research papers, grid computing portals and workflows, institution-wide provisioning and authentication systems, and legal protocols for copyright management. QUT also has two widely recognised centres focused on fundamental research into eResearch itself: the OAK LAW project (Open Access to Knowledge), which focuses upon legal issues relating to eResearch, and the Microsoft QUT eResearch Centre, whose goal is to accelerate scientific research discovery through new smart software. In order to better harness all of these resources and improve research outcomes, the university recently established a project to investigate how it might better organise the support of eResearch. This presentation will outline the project outcomes, which include a flexible and sustainable eResearch support service model addressing short- and longer-term research needs, identification of the resources required to establish and sustain the service, and the development of research data management policies and implementation plans.
Abstract:
This paper argues for a future-oriented inclusion of Engineering Model Eliciting Activities (EngMEAs) in elementary mathematics curricula. In EngMEAs, students work with meaningful engineering problems that capitalise on and extend their existing mathematics and science learning to develop, revise and document powerful models while working in groups. The models developed by six groups of 12-year-old students in solving the Natural Gas activity are presented. Results showed that the student models adequately solved the problem, although they did not take into account all the data provided. Student solutions varied in the extent to which students employed the engineering context in their models and in their understanding of the mathematical concepts involved in the problem. Finally, recommendations for implementing EngMEAs and for further research are discussed.
Abstract:
In an empirical test and extension of Klein, Conn and Sorra’s model of innovation implementation effectiveness, we apply structural equation modelling to identify the generalizability of their data-modified model in comparison with their theorised model. We examined the implementation of various types of innovations in a sample of 135 organizations. We found that the data supported the original model rather than the data-modified model, such that implementation climate mediated the relationship between policies and practices and implementation effectiveness, while implementation effectiveness partially mediated the relationship between implementation climate and innovation effectiveness. Furthermore, we extend their model to suggest that the availability of non-financial resources plays a critical role in implementation policies and practices.
Abstract:
To navigate successfully in a previously unexplored environment, a mobile robot must be able to estimate the spatial relationships of the objects of interest accurately. A Simultaneous Localization and Mapping (SLAM) system employs its sensors to incrementally build a map of its surroundings and to localize itself in the map simultaneously. The aim of this research project is to develop a SLAM system suitable for self-propelled household lawnmowers. The proposed bearing-only SLAM system requires only an omnidirectional camera and some inexpensive landmarks. The main advantage of an omnidirectional camera is the panoramic view of all the landmarks in the scene. Placing landmarks in a lawn field to define the working domain is much easier and more flexible than installing the perimeter wire required by existing autonomous lawnmowers. The common approach of existing bearing-only SLAM methods relies on a motion model for predicting the robot’s pose and a sensor model for updating the pose. In the motion model, the error on the estimates of object positions accumulates, due mainly to wheel slippage. Accurately quantifying the uncertainty of object positions is therefore a fundamental requirement. In bearing-only SLAM, the Probability Density Function (PDF) of a landmark’s position should be uniform along the observed bearing. Existing methods that approximate the PDF with a Gaussian estimation do not satisfy this uniformity requirement. This thesis introduces both geometric and probabilistic methods to address the above problems. The main novel contributions of this thesis are: 1. A bearing-only SLAM method not requiring odometry. The proposed method relies solely on the sensor model (landmark bearings only) without relying on the motion model (odometry). The uncertainty of the estimated landmark positions depends on the vision error only, instead of the combination of both odometry and vision errors. 2. The transformation of the spatial uncertainty of objects. This thesis introduces a novel method for translating the spatial uncertainty of objects estimated from a moving frame attached to the robot into the global frame attached to the static landmarks in the environment. 3. The characterization of an improved PDF for representing landmark position in bearing-only SLAM. The proposed PDF is expressed in polar coordinates, and the marginal probability on range is constrained to be uniform. Compared to the PDF estimated from a mixture of Gaussians, the PDF developed here has far fewer parameters and can be easily adopted in a probabilistic framework, such as a particle filtering system. The main advantages of the proposed bearing-only SLAM system are its lower production cost and flexibility of use. The proposed system can be adopted in other domestic robots as well, such as vacuum cleaners or robotic toys, when the terrain is essentially 2D.
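The polar-coordinate landmark PDF described above, with a uniform marginal on range along the observed bearing, lends itself to a particle representation of the kind mentioned in the abstract. The following sketch shows one plausible sampler; the range limits and bearing-noise level are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def sample_landmark_particles(bearing, sigma_bearing, r_min, r_max,
                              n=1000, rng=None):
    """Draw particles for a landmark observed at `bearing` (radians).

    Range is uniform on [r_min, r_max] -- the uniform marginal the thesis
    requires along the observed bearing -- while bearing noise is Gaussian
    with standard deviation `sigma_bearing` (the vision error).
    """
    rng = np.random.default_rng(rng)
    r = rng.uniform(r_min, r_max, n)              # uniform marginal on range
    theta = rng.normal(bearing, sigma_bearing, n) # small angular vision error
    # Convert polar samples to Cartesian coordinates in the robot frame.
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

pts = sample_landmark_particles(bearing=0.5, sigma_bearing=0.02,
                                r_min=1.0, r_max=10.0, n=2000, rng=0)
```

Unlike a single-Gaussian approximation, this particle cloud stays elongated along the bearing ray, which is why such a PDF drops naturally into a particle filtering framework.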
Abstract:
Cooperative collision warning systems for road vehicles, enabled by recent advances in positioning systems and wireless communication technologies, can potentially reduce traffic accidents significantly. To improve such a system, we propose a graph model to represent interactions between multiple road vehicles in a specific region and at a specific time. Given a list of vehicles in the vicinity, we can generate the interaction graph using several rules that consider vehicle properties such as position, speed, and heading. Safety applications can use the model to improve emergency warning accuracy and optimize wireless channel usage. The model also allows us to develop congestion control strategies for an efficient multi-hop broadcast protocol.
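An interaction graph of this kind can be sketched as a rule-based edge test over vehicle states. The specific rules below (a range threshold plus a heading-difference check) are simplified stand-ins for the paper's rule set, chosen only to illustrate the construction:

```python
import math

def build_interaction_graph(vehicles, max_range=150.0,
                            heading_tol=math.pi / 2):
    """Build an undirected interaction graph over a list of vehicle dicts.

    Two vehicles are linked when they are within `max_range` metres and
    their headings differ by at most `heading_tol` radians. Both rules are
    hypothetical examples of the position/speed/heading rules the model uses.
    """
    edges = set()
    for i, a in enumerate(vehicles):
        for j in range(i + 1, len(vehicles)):
            b = vehicles[j]
            dist = math.hypot(a["x"] - b["x"], a["y"] - b["y"])
            dh = abs(a["heading"] - b["heading"]) % (2 * math.pi)
            dh = min(dh, 2 * math.pi - dh)        # wrap to [0, pi]
            if dist <= max_range and dh <= heading_tol:
                edges.add((i, j))
    return edges

vehicles = [{"x": 0,   "y": 0, "heading": 0.0},
            {"x": 50,  "y": 0, "heading": 0.1},
            {"x": 500, "y": 0, "heading": 0.0}]
graph = build_interaction_graph(vehicles)   # only the two nearby vehicles link
```

A warning application could then restrict rebroadcasts to a vehicle's graph neighbours, which is how such a model can reduce wireless channel load.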
Abstract:
Ecological dynamics characterizes adaptive behavior as an emergent, self-organizing property of interpersonal interactions in complex social systems. The authors conceptualize and investigate constraints on the dynamics of decisions and actions in the multiagent system of team sports. They studied coadaptive interpersonal dynamics in rugby union to model potential control parameter and collective variable relations in attacker–defender dyads. A videogrammetry analysis revealed how some agents generated fluctuations by adapting displacement velocity to create phase transitions and destabilize dyadic subsystems near the try line. Agent interpersonal dynamics exhibited characteristics of chaotic attractors, and informational constraints of rugby union boxed dyadic systems into a low-dimensional attractor. The data suggest that decisions and actions of agents in sports teams may be characterized as emergent, self-organizing properties, governed by laws of dynamical systems at the ecological scale. Further research needs to generalize this conceptual model of adaptive behavior in performance to other multiagent populations.
Abstract:
Modern Engineering Asset Management (EAM) requires the accurate assessment of current, and the prediction of future, asset health condition. Appropriate mathematical models that are capable of estimating times to failure and the probability of failures in the future are essential in EAM. In most real-life situations, the lifetime of an engineering asset is influenced and/or indicated by different factors that are termed covariates. Hazard prediction with covariates is an elemental notion in reliability theory, estimating the tendency of an engineering asset to fail instantaneously beyond the current time, given that it has already survived up to the current time. A number of statistical covariate-based hazard models have been developed. However, none of them has explicitly incorporated both external and internal covariates into one model. This paper introduces a novel covariate-based hazard model to address this concern. This model is named the Explicit Hazard Model (EHM). Both the semi-parametric and non-parametric forms of this model are presented in the paper. The major purpose of this paper is to illustrate the theoretical development of EHM. Due to page limitations, a case study with reliability field data is presented in the applications part of this study.
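The abstract does not give EHM's functional form, so as background the general shape of a semi-parametric covariate-based hazard can be sketched with a generic proportional-hazards construction, where time-varying covariates let internal (condition) and external (environment) factors enter the same model. The baseline, coefficients, and covariates below are all illustrative assumptions, not EHM itself:

```python
import math

def hazard(t, baseline, beta, covariates):
    """Generic covariate-based hazard: h(t) = h0(t) * exp(beta . z(t)).

    `baseline` is a function of time (left unspecified in semi-parametric
    models); `covariates` returns the covariate vector z(t), which may vary
    with time. This proportional-hazards form is a stand-in for EHM.
    """
    z = covariates(t)
    return baseline(t) * math.exp(sum(b * zi for b, zi in zip(beta, z)))

# Illustrative Weibull-type baseline plus one external and one internal covariate.
h0 = lambda t: 0.01 * 1.5 * t ** 0.5       # hypothetical baseline hazard
covs = lambda t: [25.0, 0.002 * t]         # [ambient temp, wear indicator]
h = hazard(t=100.0, baseline=h0, beta=[0.01, 5.0], covariates=covs)
```

Integrating such a hazard over time gives the survival (reliability) function, which is how models of this family yield the failure probabilities EAM needs.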
Abstract:
Tested D. J. Kavanagh's (1983) depression model's explanation of response to cognitive-behavioral treatment among 19 20–60 yr old Ss who received treatment and 24 age-matched Ss who were assigned to a waiting list. Measures included the Beck Depression Inventory and self-efficacy (SE) and self-monitoring scales. Rises in SE and self-monitored performance of targeted skills were closely associated with the improved depression scores of treated Ss. Improvements in the depression of waiting list Ss occurred through random, uncontrolled events rather than via a systematic increase in specific skills targeted in treatment. SE regarding assertion also predicted depression scores over a 12-wk follow-up.
Abstract:
Hazard and reliability prediction of an engineering asset is one of the significant fields of research in Engineering Asset Health Management (EAHM). In real-life situations, where an engineering asset operates under dynamic operational and environmental conditions, the lifetime of an engineering asset can be influenced and/or indicated by different factors that are termed covariates. The Explicit Hazard Model (EHM), a covariate-based hazard model, is a new approach for hazard prediction which explicitly incorporates both internal and external covariates into one model. EHM is an appropriate model to use in the analysis of lifetime data in the presence of both internal and external covariates in the reliability field. This paper presents applications of the methodology which is introduced and illustrated in the theory part of this study. In this paper, the semi-parametric EHM is applied to a case study so as to predict the hazard and reliability of resistance elements on a Resistance Corrosion Sensor Board (RCSB).
Abstract:
The purpose of this study was to identify the pedagogical knowledge relevant to the successful completion of a pie chart item. This purpose was achieved through the identification of the essential fluencies that 12–13-year-olds required for the successful solution of a pie chart item. Fluency relates to ease of solution and is particularly important in mathematics because it impacts on performance. Although the majority of students were successful on this multiple choice item, there was considerable divergence in the strategies they employed. Approximately two-thirds of the students employed efficient multiplicative strategies, which recognised and capitalised on the pie chart as a proportional representation. In contrast, the remaining one-third of students used a less efficient additive strategy that failed to capitalise on the representation of the pie chart. The results of our investigation of students’ performance on the pie chart item during individual interviews revealed that five distinct fluencies were involved in the solution process: conceptual (understanding the question), linguistic (keywords), retrieval (strategy selection), perceptual (orientation of a segment of the pie chart) and graphical (recognising the pie chart as a proportional representation). In addition, some students exhibited mild disfluencies corresponding to the five fluencies identified above. Three major outcomes emerged from the study. First, a model of knowledge of content and students for pie charts was developed. This model can be used to inform instruction about the pie chart and guide strategic support for students. Second, perceptual and graphical fluency were identified as two aspects of the curriculum, which should receive a greater emphasis in the primary years, due to their importance in interpreting pie charts. Finally, a working definition of fluency in mathematics was derived from students’ responses to the pie chart item.
Abstract:
There is increasing agreement that understanding complexity is important for project management because of difficulties associated with decision-making and goal attainment which appear to stem from complexity. However, the current operational definitions of complex projects, based upon size and budget, have been challenged, and questions have been raised about how complexity can be measured in a robust manner that takes account of structural, dynamic and interaction elements. Thematic analysis of data from 25 in-depth interviews of project managers involved with complex projects, together with an exploration of the literature, reveals a wide range of factors that may contribute to project complexity. We argue that these factors contributing to project complexity may be defined in terms of dimensions, or source characteristics, which are in turn subject to a range of severity factors. In addition to investigating definitions and models of complexity from the literature and in the field, this study also explores the problematic issues of ‘measuring’ or assessing complexity. A research agenda is proposed to further the investigation of phenomena reported in this initial study.