461 results for Accelerated failure time model
Abstract:
Background Despite the widely recognised importance of sustainable health care systems, health services research remains generally underfunded in Australia. The Australian Centre for Health Services Innovation (AusHSI) is funding health services research in the state of Queensland. AusHSI has developed a streamlined protocol for applying and awarding funding using a short proposal and accelerated peer review. Method An observational study of proposals for four health services research funding rounds from May 2012 to November 2013. A short proposal of less than 1,200 words was submitted using a secure web-based portal. The primary outcome measures are: time spent preparing proposals; a simplified scoring of grant proposals (reject, revise or accept for interview) by a scientific review committee; and progressing from submission to funding outcomes within eight weeks. Proposals outside of health services research were deemed ineligible. Results There were 228 eligible proposals across the four funding rounds: from 29% to 79% were shortlisted and 9% to 32% were accepted for interview. Success rates increased from 6% (in 2012) to 16% (in 2013) of eligible proposals. Applicants were notified of the outcomes within two weeks of the interview, which was a maximum of eight weeks after the submission deadline. Applicants spent 7 days on average preparing their proposal. Applicants with a ranking of reject or revise received written feedback and suggested improvements for their proposals, and resubmissions comprised one-third of the 2013 rounds. Conclusions The AusHSI funding scheme is a streamlined application process that has simplified the process of allocating health services research funding for both applicants and peer reviewers. The AusHSI process has minimised the time from submission to notification of funding outcomes.
Abstract:
Background: This study attempted to develop health risk-based metrics for defining a heatwave in Brisbane, Australia. Methods: A Poisson generalised additive model was used to assess the impact of heatwaves on mortality and emergency hospital admissions (EHAs) in Brisbane. Results: In general, the higher the intensity and the longer the duration of a heatwave, the greater the health impacts. There was no apparent difference in EHA risk during different periods of a warm season. However, there was a greater risk of mortality in the second half of a warm season than in the first half. While the elderly (>75 years) were particularly vulnerable to both the EHA and mortality effects of a heatwave, the risk for EHAs also significantly increased for two other age groups (0-64 years and 65-74 years) during severe heatwaves. Different patterns between cardiorespiratory mortality and EHAs were observed. Based on these findings, we propose the use of a tiered heat warning system based on the health risks of heatwaves. Conclusions: Health risk-based metrics are a useful tool for the development of local heatwave definitions. This tool may have significant implications for the assessment of heatwave-related health consequences and the development of heatwave response plans and implementation strategies.
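The count-regression core of such a health-impact analysis can be sketched in a few lines. The following is a minimal numpy illustration of Poisson regression fit by iteratively reweighted least squares on simulated data; the heatwave indicator, rates and sample size are invented for illustration, and the authors' actual model is a generalised additive model with smooth terms rather than this bare log-linear fit.

```python
import numpy as np

def poisson_irls(X, y, n_iter=50):
    """Fit a Poisson log-linear model by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        W = mu                            # Poisson working weights
        z = X @ beta + (y - mu) / mu      # working response
        beta = np.linalg.solve((X * W[:, None]).T @ X, (X * W[:, None]).T @ z)
    return beta

rng = np.random.default_rng(0)
n = 2000
heatwave = rng.integers(0, 2, n)          # hypothetical daily heatwave indicator
mu_true = np.exp(1.0 + 0.3 * heatwave)    # true mortality rate ratio exp(0.3)
deaths = rng.poisson(mu_true)             # simulated daily death counts
X = np.column_stack([np.ones(n), heatwave])
beta = poisson_irls(X, deaths)
print(np.exp(beta[1]))                    # estimated rate ratio on heatwave days
```

The estimated rate ratio recovers the simulated effect; a real analysis would add smooth terms for temperature, season and long-term trend.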
An external field prior for the hidden Potts model with application to cone-beam computed tomography
Abstract:
In images with low contrast-to-noise ratio (CNR), the information gain from the observed pixel values can be insufficient to distinguish foreground objects. A Bayesian approach to this problem is to incorporate prior information about the objects into a statistical model. A method for representing spatial prior information as an external field in a hidden Potts model is introduced. This prior distribution over the latent pixel labels is a mixture of Gaussian fields, centred on the positions of the objects at a previous point in time. It is particularly applicable in longitudinal imaging studies, where the manual segmentation of one image can be used as a prior for automatic segmentation of subsequent images. The method is demonstrated by application to cone-beam computed tomography (CT), an imaging modality that exhibits distortions in pixel values due to X-ray scatter. The external field prior results in a substantial improvement in segmentation accuracy, reducing the mean pixel misclassification rate for an electron density phantom from 87% to 6%. The method is also applied to radiotherapy patient data, demonstrating how to derive the external field prior in a clinical context.
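The effect of such a spatial prior can be illustrated with a pixelwise Bayesian toy example. The sketch below uses only a per-pixel Gaussian-mixture external field and pixelwise Bayes' rule, omitting the Potts smoothing term of the actual model; the image size, noise level, contrast and object positions are all invented for illustration.

```python
import numpy as np

def external_field(shape, centres, sigma):
    """Prior probability of the object label at each pixel: a mixture of
    Gaussian fields centred on the object's previous positions."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    field = np.zeros(shape)
    for (cy, cx) in centres:
        field += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(field / len(centres), 1e-6, 1 - 1e-6)

# Hypothetical 64x64 image: a faint foreground disc in heavy noise (low CNR)
rng = np.random.default_rng(1)
shape = (64, 64)
yy, xx = np.mgrid[0:64, 0:64]
truth = ((yy - 32) ** 2 + (xx - 32) ** 2) < 10 ** 2
image = truth * 0.5 + rng.normal(0.0, 1.0, shape)

# External field from the object's (slightly shifted) previous position
prior = external_field(shape, centres=[(30, 31)], sigma=8.0)

# Pixelwise Bayes with unit-variance Gaussian likelihoods for each label
like_fg = np.exp(-0.5 * (image - 0.5) ** 2)
like_bg = np.exp(-0.5 * image ** 2)
post = like_fg * prior / (like_fg * prior + like_bg * (1 - prior))
labels = post > 0.5
error = np.mean(labels != truth)
print(error)   # far below the prior-free likelihood-only segmentation
```

Even without the Potts interaction term, the external field suppresses the gross misclassification that the likelihood alone produces at this noise level, which is the qualitative effect the abstract reports for the CT phantom.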
Abstract:
1. In conservation decision-making, we operate within the confines of limited funding. Furthermore, we often assume particular relationships between management impact and our investment in management. The structure of these relationships, however, is rarely known with certainty - there is model uncertainty. We investigate how these two fundamentally limiting factors in conservation management, money and knowledge, impact optimal decision-making. 2. We use information-gap decision theory to find strategies for maximizing the number of extant subpopulations of a threatened species that are most immune to failure due to model uncertainty. We thus find a robust framework for exploring optimal decision-making. 3. The performance of every strategy decreases as model uncertainty increases. 4. The strategy most robust to model uncertainty depends not only on what performance is perceived to be acceptable but also on available funding and the time horizon over which extinction is considered. 5. Synthesis and applications. We investigate the impact of model uncertainty on robust decision-making in conservation and how this is affected by available conservation funding. We show that subpopulation triage can be a natural consequence of robust decision-making. We highlight the need for managers to consider triage not as merely giving up, but as a tool for ensuring species persistence in light of the urgency of most conservation requirements, uncertainty and the poor state of conservation funding. We illustrate this theory by a specific application to allocation of funding to reduce poaching impact on the Sumatran tiger Panthera tigris sumatrae in Kerinci Seblat National Park. © 2008 The Authors.
Abstract:
Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on an empirical stochastic model which may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single receiver's data stream. The methods involve three steps: forming a linear combination, handling the ionosphere and ambiguity biases, and estimating the variance. Two distinct approaches, the time-differenced method and the polynomial prediction method, are applied to overcome the ionosphere and ambiguity biases. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed method only requires single-receiver observations and is thus applicable to both differenced and undifferenced data processing modes. However, the methods may be limited to normal ionospheric conditions and GNSS receivers with low-autocorrelation noise. Experimental results also indicate that the proposed method can yield more realistic parameter precision.
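The time-differenced idea can be sketched on synthetic data. In the toy example below (all numbers invented, not real GNSS observations), differencing successive epochs cancels the constant carrier-phase ambiguity, and second differences also remove the slowly varying geometry/ionosphere trend, leaving essentially pure observation noise whose variance can be read off.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0.0, 600.0, 1.0)          # 1 Hz observations over 10 minutes
trend = 0.2 * t + 5e-4 * t ** 2         # slow geometry + ionosphere drift (m)
ambiguity = 17.0                        # constant phase ambiguity term (m)
sigma_true = 0.003                      # 3 mm-level observation noise
phase = trend + ambiguity + rng.normal(0.0, sigma_true, t.size)

# Second differences of white noise have variance 6*sigma^2; the remaining
# trend contribution is a constant (2 * 5e-4) that is absorbed by the mean.
d2 = np.diff(phase, 2)
sigma_hat = np.std(d2, ddof=1) / np.sqrt(6)
print(sigma_hat)   # close to sigma_true
```

This recovers the injected noise level without any reference receiver, which is the practical appeal of single-receiver variance estimation over zero- and short-baseline tests.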
Abstract:
Background Recent advances in immunology have highlighted the importance of local properties on the overall progression of HIV infection. In particular, the gastrointestinal tract is seen as a key area during early infection, and the massive cell depletion associated with it may influence subsequent disease progression. This motivated the development of a large-scale agent-based model. Results Lymph nodes are explicitly implemented, and considerations of parallel computing permit large simulations and the inclusion of local features. The results obtained show that GI tract inclusion in the model leads to an accelerated disease progression, during both the early stages and the long-term evolution, compared to a theoretical, uniform model. Conclusions These results confirm the potential of treatment policies currently under investigation, which focus on this region. They also highlight the potential of this modelling framework, incorporating both agent-based and network-based components, in the context of complex systems where scaling-up alone does not result in models providing additional insights.
Abstract:
The fractional Fokker-Planck equation is an important physical model for simulating anomalous diffusion with external forces. Because of the non-local property of the fractional derivative, an interesting problem is to explore high-accuracy numerical methods for fractional differential equations. In this paper, a space-time spectral method is presented for the numerical solution of the time fractional Fokker-Planck initial-boundary value problem. The proposed method employs Jacobi polynomials for the temporal discretization and Fourier-like basis functions for the spatial discretization. The diagonalizable nature of the Fourier-like basis functions leads to a reduced representation of the inner product in the Galerkin analysis. We prove that the time fractional Fokker-Planck equation attains the same approximation order as the time fractional diffusion equation developed in [23] by using the present method. This indicates that exponential decay may be achieved if the exact solution is sufficiently smooth. Finally, some numerical results are given to demonstrate the high-order accuracy and efficiency of the new numerical scheme. The results show that the errors of the numerical solutions obtained by the space-time spectral method decay exponentially.
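The exponential (spectral) convergence claimed for smooth solutions can be illustrated on a much simpler problem. The sketch below fits Chebyshev interpolants, not the paper's Jacobi/Fourier-like bases, to a smooth test function and watches the maximum error collapse as the degree grows; it demonstrates only the generic spectral-accuracy phenomenon, not the scheme itself.

```python
import numpy as np

# A smooth (analytic) test function on [-1, 1]
f = lambda x: np.exp(np.sin(np.pi * x))
x_fine = np.linspace(-1.0, 1.0, 2001)

errors = []
for n in (4, 8, 16, 32):
    # Chebyshev nodes of the first kind; degree-n interpolation through them
    k = np.arange(n + 1)
    nodes = np.cos((2 * k + 1) * np.pi / (2 * (n + 1)))
    coeffs = np.polynomial.chebyshev.chebfit(nodes, f(nodes), n)
    approx = np.polynomial.chebyshev.chebval(x_fine, coeffs)
    errors.append(np.max(np.abs(approx - f(x_fine))))
print(errors)   # drops by orders of magnitude each time the degree doubles
```

For smooth functions the error decays faster than any power of the degree, which is the behaviour the abstract reports for sufficiently smooth exact solutions of the fractional Fokker-Planck problem.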
Abstract:
This thesis aims to study the structural behaviour of high bond strength masonry shear walls by developing a combined interface and surface contact model. The results are further verified by a cost-effective structural-level model which was then extensively used for predicting all possible failure modes of high bond strength masonry shear walls. It is concluded that the increase in bond strength of masonry modifies the failure mode from diagonal cracking to base sliding and does not proportionally increase the in-plane shear capacity. This can be overcome by increasing pre-compression pressure, which causes failure through the blocks. A design equation is proposed, and high bond strength masonry is recommended for taller buildings and/or pre-stressed masonry applications.
Abstract:
This paper discusses the Coordinated Family Dispute Resolution (family mediation) process piloted in Australia in 2010–2012. This process was evaluated by the Australian Institute of Family Studies as being ‘at the cutting edge of family law practice’ because it involves the conscious application of mediation where there has been a history of family violence, in a clinically collaborative multidisciplinary and multi-agency setting. The Australian government’s failure to invest resources in the ongoing funding of this model jeopardises the safety and efficacy of family dispute resolution practice in family violence contexts, and compromises the hearing of the voices of family violence victims and their children.
Abstract:
Recent changes in the aviation industry and in the expectations of travellers have begun to alter the way we approach our understanding, and thus the segmentation, of airport passengers. The key to successful segmentation of any population lies in the selection of the criteria on which the partitions are based. Increasingly, the basic criteria used to segment passengers (purpose of trip and frequency of travel) no longer provide adequate insights into the passenger experience. In this paper, we propose a new model for passenger segmentation based on the passengers' core value: time. The results are based on qualitative research conducted in situ at Brisbane International Terminal during 2012-2013. Based on our research, a relationship between time sensitivity and degree of passenger engagement was identified. This relationship was used as the basis for a new passenger segmentation model, namely: Airport Enthusiast (engaged, non time sensitive); Time Filler (non engaged, non time sensitive); Efficiency Lover (non engaged, time sensitive) and Efficient Enthusiast (engaged, time sensitive). The outcomes of this research extend the theoretical knowledge about passenger experience in the terminal environment. These new insights can ultimately be used to optimise the allocation of space for future terminal planning and design.
Abstract:
The development of methods for real-time crash prediction as a function of current or recent traffic and roadway conditions is gaining increasing attention in the literature. Numerous studies have modeled the relationships between traffic characteristics and crash occurrence, and significant progress has been made. Given the accumulated evidence on this topic and the lack of an articulate summary of research status, challenges, and opportunities, there is an urgent need to scientifically review these studies and to synthesize the existing state-of-the-art knowledge. This paper addresses this need by undertaking a systematic literature review to identify current knowledge, challenges, and opportunities, and then conducts a meta-analysis of existing studies to provide a summary impact of traffic characteristics on crash occurrence. Sensitivity analyses were conducted to assess quality, publication bias, and outlier bias of the various studies; and the time intervals used to measure traffic characteristics were also considered. As a result of this comprehensive and systematic review, issues in study designs, traffic and crash data, and model development and validation are discussed. Outcomes of this study are intended to provide researchers focused on real-time crash prediction with greater insight into the modeling of this important but extremely challenging safety issue.
Abstract:
Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
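The core comparison of observed and modeled behaviour can be sketched very simply. The toy example below checks an event log against a purely sequential process model; real conformance checking (e.g. Petri-net alignments in process-mining tools) is far richer, and the activity names and log here are invented.

```python
# Toy "model": a strictly sequential process
model = ["receive", "check", "approve", "archive"]

def conforms(trace, model):
    """For this toy sequential model, a trace conforms only if it replays
    the modeled sequence exactly, with no skipped or extra activities."""
    return trace == model

# Hypothetical event log: one trace deviates by skipping the check step
log = [
    ["receive", "check", "approve", "archive"],
    ["receive", "approve", "archive"],
    ["receive", "check", "approve", "archive"],
]
fitness = sum(conforms(tr, model) for tr in log) / len(log)
print(fitness)   # fraction of observed traces matching the modeled behaviour
```

A "liquid" model collection would use such comparisons continuously, adapting the stored models when observed behaviour drifts away from them.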
Abstract:
This paper introduces the smooth transition logit (STL) model that is designed to detect and model situations in which there is structural change in the behaviour underlying the latent index from which the binary dependent variable is constructed. The maximum likelihood estimators of the parameters of the model are derived along with their asymptotic properties, together with a Lagrange multiplier test of the null hypothesis of linearity in the underlying latent index. The development of the STL model is motivated by the desire to assess the impact of deregulation in the Queensland electricity market and ascertain whether increased competition has resulted in significant changes in the behaviour of the spot price of electricity, specifically with respect to the occurrence of periodic abnormally high prices. The model allows the timing of any change to be endogenously determined and also market participants' behaviour to change gradually over time. The main results provide clear evidence in support of a structural change in the nature of price events, and the endogenously determined timing of the change is consistent with the process of deregulation in Queensland.
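The mechanics of a smooth transition in a latent index can be sketched directly. The code below evaluates a logistic transition function G(t; gamma, c) and the implied spike probability before and after an endogenously timed change; the regressor, coefficients, change date and transition speed are all invented for illustration and are not estimates from the Queensland data.

```python
import numpy as np

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

def transition(t, gamma, c):
    """Smooth transition function G(t; gamma, c): near 0 before the change,
    near 1 after, with speed gamma and midpoint (change date) c."""
    return logistic(gamma * (t - c))

t = np.linspace(0.0, 100.0, 201)   # hypothetical time index
x = 1.0                            # a single standardized regressor, held fixed
beta, delta = -2.0, 1.5            # pre-change effect and its post-change shift
G = transition(t, gamma=0.5, c=60.0)

# STL-style latent index: x'beta before the change, x'(beta + delta) after
p_spike = logistic(beta * x + G * delta * x)
print(p_spike[0], p_spike[-1])
```

Estimation would maximize the binary log-likelihood over (beta, delta, gamma, c) jointly, which is how the timing of the change is determined endogenously rather than imposed.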
Abstract:
This paper presents a novel path planning method for minimizing the energy consumption of an autonomous underwater vehicle subjected to time varying ocean disturbances and forecast model uncertainty. The algorithm determines 4-Dimensional path candidates using Nonlinear Robust Model Predictive Control (NRMPC) and solutions optimised using A*-like algorithms. Vehicle performance limits are incorporated into the algorithm with disturbances represented as spatial and temporally varying ocean currents with a bounded uncertainty in their predictions. The proposed algorithm is demonstrated through simulations using a 4-Dimensional, spatially distributed time-series predictive ocean current model. Results show the combined NRMPC and A* approach is capable of generating energy-efficient paths which are resistant to both dynamic disturbances and ocean model uncertainty.
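The A*-like half of the approach can be sketched on a 2-D grid. The toy planner below charges extra energy for moving against a known, static current field; the real method plans 4-D paths against time-varying, uncertain forecasts via NRMPC, and the grid, current band and cost model here are invented for illustration.

```python
import heapq

def astar_energy(grid_w, grid_h, start, goal, current_x):
    """A* on a grid where each move costs 1 unit of energy plus extra effort
    when heading against the local x-component of the current (a crude proxy)."""
    def heuristic(p):   # admissible: assumes the best case of no extra effort
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(heuristic(start), 0.0, start, [start])]
    seen = {}
    while frontier:
        f, g, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return g, path
        if pos in seen and seen[pos] <= g:
            continue
        seen[pos] = g
        x, y = pos
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < grid_w and 0 <= ny < grid_h:
                # Extra energy only when surging against the current
                cost = 1.0 + max(0.0, -dx * current_x[ny][nx])
                npos = (nx, ny)
                heapq.heappush(frontier, (g + cost + heuristic(npos),
                                          g + cost, npos, [*path, npos]))
    return float("inf"), []

# Eastward current band in the middle row; travelling with the flow keeps
# the path at the minimum 1-unit-per-move energy cost.
current_x = [[0.0] * 10 for _ in range(5)]
for xcell in range(10):
    current_x[2][xcell] = 0.8
energy, path = astar_energy(10, 5, (0, 2), (9, 2), current_x)
print(energy, len(path))
```

Replacing the static `current_x` with a forecast-driven, time-indexed field (and re-planning as NRMPC updates the bounded-uncertainty disturbance estimate) is the step that the toy omits.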
Abstract:
Purpose Performance heterogeneity between collaborative infrastructure projects is typically examined by considering procurement systems and their governance mechanisms at static points in time. The literature neglects to consider the impact of dynamic learning capability, which is thought to reconfigure governance mechanisms over time in response to evolving market conditions. This conceptual paper proposes a new model to show how continuous joint learning of participant organisations improves project performance. Design/methodology/approach There are two stages of conceptual development. In the first stage, the management literature is analysed to explain the Standard Model of dynamic learning capability that emphasises three learning phases for organisations. This Standard Model is extended to derive a novel Circular Model of dynamic learning capability that shows a new feedback loop between performance and learning. In the second stage, the construction management literature is consulted, adding project lifecycle, stakeholder diversity and three organisational levels to the analysis, to arrive at the Collaborative Model of dynamic learning capability. Findings The Collaborative Model should enable construction organisations to successfully adapt and perform under changing market conditions. The complexity of learning cycles results in capabilities that are imperfectly imitable between organisations, explaining performance heterogeneity on projects. Originality/value The Collaborative Model provides a theoretically substantiated description of project performance, driven by the evolution of procurement systems and governance mechanisms. The Model’s empirical value will be tested in future research.