914 results for Accelerated failure time model


Relevance:

30.00%

Publisher:

Abstract:

1. In conservation decision-making, we operate within the confines of limited funding. Furthermore, we often assume particular relationships between management impact and our investment in management. The structure of these relationships, however, is rarely known with certainty - there is model uncertainty. We investigate how these two fundamentally limiting factors in conservation management, money and knowledge, impact optimal decision-making. 2. We use information-gap decision theory to find strategies for maximizing the number of extant subpopulations of a threatened species that are most immune to failure due to model uncertainty. We thus find a robust framework for exploring optimal decision-making. 3. The performance of every strategy decreases as model uncertainty increases. 4. The strategy most robust to model uncertainty depends not only on what performance is perceived to be acceptable but also on available funding and the time horizon over which extinction is considered. 5. Synthesis and applications. We investigate the impact of model uncertainty on robust decision-making in conservation and how this is affected by available conservation funding. We show that subpopulation triage can be a natural consequence of robust decision-making. We highlight the need for managers to consider triage not as merely giving up, but as a tool for ensuring species persistence in light of the urgency of most conservation requirements, uncertainty and the poor state of conservation funding. We illustrate this theory by a specific application to allocation of funding to reduce poaching impact on the Sumatran tiger Panthera tigris sumatrae in Kerinci Seblat National Park. © 2008 The Authors.
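The info-gap logic described above can be sketched numerically: for each candidate funding allocation, find the largest horizon of uncertainty at which worst-case performance still meets the required number of extant subpopulations. This is only an illustrative toy; the persistence model, slope, budgets and performance requirement below are invented and are not the paper's actual model:

```python
import numpy as np

def worst_case_subpops(budget_alloc, alpha, nominal_slope=0.02):
    """Worst-case expected number of extant subpopulations when the
    true investment-response slope may fall short of the nominal model
    by a fraction alpha (the info-gap horizon of uncertainty)."""
    slope = nominal_slope * max(0.0, 1.0 - alpha)   # worst model in U(alpha)
    # toy persistence model: survival probability saturates with funding
    p_survive = 1.0 - np.exp(-slope * np.asarray(budget_alloc, dtype=float))
    return float(np.sum(p_survive))

def robustness(budget_alloc, required_subpops):
    """Info-gap robustness: the largest uncertainty alpha at which the
    strategy still meets the required performance."""
    alphas = np.linspace(0.0, 1.0, 101)
    feasible = [a for a in alphas
                if worst_case_subpops(budget_alloc, a) >= required_subpops]
    return max(feasible) if feasible else 0.0

# compare an equal split with a triage strategy on the same total budget
for name, alloc in [("equal split", [25, 25, 25, 25]),
                    ("triage", [50, 50, 0, 0])]:
    print(name, robustness(alloc, required_subpops=1.0))
```

Plotting robustness against the required performance level reproduces the trade-off the abstract describes: demanding higher performance leaves less immunity to model uncertainty.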


Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on an empirical stochastic model that may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single receiver's data stream. The methods involve three steps: forming a linear combination, handling the ionosphere and ambiguity biases, and estimating the variance. Two distinct approaches are applied to overcome the ionosphere and ambiguity biases: the time-differenced method and the polynomial prediction method. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed method requires only single-receiver observations and is therefore applicable to both differenced and undifferenced data processing modes. However, the methods may be limited to normal ionospheric conditions and to GNSS receivers with low-autocorrelation noise. Experimental results also indicate that the proposed method can produce more realistic parameter precision.
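The time-differenced idea can be illustrated with simulated data: differencing successive epochs removes the slowly varying geometry, ionosphere and ambiguity bias, and for white noise the variance of the difference is twice the per-epoch observation variance. The signal model and all numbers below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# simulated single-receiver observation stream (metres, 1 Hz, one hour):
# a slowly varying bias plus white observation noise
t = np.arange(3600)
bias = 5.0 + 1e-4 * t + 0.05 * np.sin(2 * np.pi * t / 1800)
true_sigma = 0.003                        # 3 mm phase-level noise
obs = bias + rng.normal(0.0, true_sigma, t.size)

# time-differenced method: epoch differencing removes the slowly varying
# bias almost entirely; for white noise var(diff) = 2 * var(single epoch)
diff = np.diff(obs)
sigma_hat = np.std(diff, ddof=1) / np.sqrt(2.0)
print(f"estimated sigma = {sigma_hat:.4f} m (true {true_sigma} m)")
```

The same recipe fails when the bias changes quickly between epochs (e.g. under ionospheric scintillation), which is consistent with the normal-ionosphere caveat above.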


Background: Recent advances in immunology have highlighted the importance of local properties in the overall progression of HIV infection. In particular, the gastrointestinal tract is seen as a key area during early infection, and the massive cell depletion associated with it may influence subsequent disease progression. This motivated the development of a large-scale agent-based model. Results: Lymph nodes are explicitly implemented, and considerations of parallel computing permit large simulations and the inclusion of local features. The results obtained show that including the GI tract in the model leads to an accelerated disease progression, during both the early stages and the long-term evolution, compared to a theoretical, uniform model. Conclusions: These results confirm the potential of treatment policies currently under investigation, which focus on this region. They also highlight the potential of this modelling framework, incorporating both agent-based and network-based components, in the context of complex systems where scaling up alone does not yield models providing additional insights.


The fractional Fokker-Planck equation is an important physical model for simulating anomalous diffusion under external forces. Because of the non-local property of the fractional derivative, an interesting problem is to explore high-accuracy numerical methods for fractional differential equations. In this paper, a space-time spectral method is presented for the numerical solution of the time-fractional Fokker-Planck initial-boundary value problem. The proposed method employs Jacobi polynomials for the temporal discretization and Fourier-like basis functions for the spatial discretization. Because the Fourier-like basis functions are diagonalizable, the inner product in the Galerkin analysis admits a reduced representation. We prove that the present method attains, for the time-fractional Fokker-Planck equation, the same approximation order as the method developed in [23] for the time-fractional diffusion equation. This indicates that exponential decay of the error may be achieved if the exact solution is sufficiently smooth. Finally, some numerical results are given to demonstrate the high-order accuracy and efficiency of the new scheme; the errors of the numerical solutions obtained by the space-time spectral method decay exponentially.
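For reference, in the notation commonly used for this problem (the paper's exact formulation may differ in details), the time-fractional Fokker-Planck equation combines a Riemann-Liouville fractional derivative of order 1 - α with a drift-diffusion operator:

```latex
\frac{\partial u(x,t)}{\partial t}
  = {}_{0}D_{t}^{1-\alpha}
    \left[
      \frac{\partial}{\partial x}\,\frac{V'(x)}{m\eta_{\alpha}}
      + K_{\alpha}\,\frac{\partial^{2}}{\partial x^{2}}
    \right] u(x,t),
\qquad 0 < \alpha < 1,
```

where V(x) is the external potential, η_α a generalised friction coefficient and K_α the generalised diffusion coefficient; α = 1 recovers the classical Fokker-Planck equation.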


This thesis studies the structural behaviour of high bond strength masonry shear walls by developing a combined interface and surface contact model. The results are further verified with a cost-effective structural-level model, which was then used extensively to predict all possible failure modes of high bond strength masonry shear walls. It is concluded that increasing the bond strength of masonry changes the failure mode from diagonal cracking to base sliding and does not proportionally increase the in-plane shear capacity. This can be overcome by increasing the pre-compression pressure, which causes failure through the blocks. A design equation is proposed, and high bond strength masonry is recommended for taller buildings and/or pre-stressed masonry applications.


This paper discusses the Coordinated Family Dispute Resolution (family mediation) process piloted in Australia in 2010–2012. This process was evaluated by the Australian Institute of Family Studies as being ‘at the cutting edge of family law practice’ because it involves the conscious application of mediation where there has been a history of family violence, in a clinically collaborative multidisciplinary and multi-agency setting. The Australian government’s failure to invest resources in the ongoing funding of this model jeopardises the safety and efficacy of family dispute resolution practice in family violence contexts, and compromises the hearing of the voices of family violence victims and their children.


Recent changes in the aviation industry and in the expectations of travellers have begun to alter the way we approach our understanding, and thus the segmentation, of airport passengers. The key to successful segmentation of any population lies in the selection of the criteria on which the partitions are based. Increasingly, the basic criteria used to segment passengers (purpose of trip and frequency of travel) no longer provide adequate insights into the passenger experience. In this paper, we propose a new model for passenger segmentation based on the passengers' core value: time. The results are based on qualitative research conducted in situ at Brisbane International Terminal during 2012-2013. Based on our research, a relationship between time sensitivity and degree of passenger engagement was identified. This relationship was used as the basis for a new passenger segmentation model comprising four segments: Airport Enthusiast (engaged, non-time-sensitive); Time Filler (non-engaged, non-time-sensitive); Efficiency Lover (non-engaged, time-sensitive); and Efficient Enthusiast (engaged, time-sensitive). The outcomes of this research extend the theoretical knowledge of passenger experience in the terminal environment. These new insights can ultimately be used to optimise the allocation of space in future terminal planning and design.
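The four segments form a simple two-by-two of engagement against time sensitivity, which can be expressed directly. The segment names are the paper's; the function itself is just an illustration of the mapping:

```python
def segment(engaged: bool, time_sensitive: bool) -> str:
    """Map the two binary dimensions of the segmentation model to
    the four named passenger segments."""
    if engaged and not time_sensitive:
        return "Airport Enthusiast"
    if not engaged and not time_sensitive:
        return "Time Filler"
    if not engaged and time_sensitive:
        return "Efficiency Lover"
    return "Efficient Enthusiast"

print(segment(engaged=True, time_sensitive=True))   # Efficient Enthusiast
```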


The development of methods for real-time crash prediction as a function of current or recent traffic and roadway conditions is gaining increasing attention in the literature. Numerous studies have modeled the relationships between traffic characteristics and crash occurrence, and significant progress has been made. Given the accumulated evidence on this topic and the lack of an articulate summary of research status, challenges, and opportunities, there is an urgent need to scientifically review these studies and to synthesize the existing state-of-the-art knowledge. This paper addresses this need by undertaking a systematic literature review to identify current knowledge, challenges, and opportunities, and then conducts a meta-analysis of existing studies to provide a summary impact of traffic characteristics on crash occurrence. Sensitivity analyses were conducted to assess quality, publication bias, and outlier bias of the various studies; and the time intervals used to measure traffic characteristics were also considered. As a result of this comprehensive and systematic review, issues in study designs, traffic and crash data, and model development and validation are discussed. Outcomes of this study are intended to provide researchers focused on real-time crash prediction with greater insight into the modeling of this important but extremely challenging safety issue.
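A typical random-effects pooling step of the kind such a meta-analysis performs can be sketched with the DerSimonian-Laird estimator. The effect sizes below are fabricated for illustration and are not results from the review:

```python
import numpy as np

def dersimonian_laird(log_ors, variances):
    """Random-effects pooling of study effect sizes (log odds ratios)
    using the standard DerSimonian-Laird estimator."""
    y = np.asarray(log_ors, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)           # fixed-effect pooled mean
    q = np.sum(w * (y - fixed) ** 2)            # Cochran's Q heterogeneity
    df = y.size - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)               # between-study variance
    w_star = 1.0 / (v + tau2)                   # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se

# hypothetical log odds ratios for one traffic characteristic's effect
pooled, se = dersimonian_laird([0.4, 0.7, 0.2, 0.5],
                               [0.04, 0.09, 0.02, 0.05])
print(pooled, se)
```

The sensitivity analyses mentioned above (quality, publication bias, outliers) amount to re-running this pooling on different subsets of studies and comparing the pooled estimates.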


Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
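Comparing observed and modeled behaviour can be sketched with a crude replay-style fitness measure; real conformance checking in process mining computes full alignments, but the idea is the same. The model and traces below are hypothetical:

```python
def trace_fitness(trace, allowed):
    """Fraction of events in an observed trace that the model allows
    as a next step -- a crude stand-in for conformance alignment."""
    state, ok = "start", 0
    for event in trace:
        if event in allowed.get(state, ()):
            ok += 1
            state = event
        # on a deviation we stay in the same state and count a miss
    return ok / len(trace)

# hypothetical order-handling model: start -> receive -> check -> ship
model = {"start": {"receive"}, "receive": {"check"}, "check": {"ship"}}
print(trace_fitness(["receive", "check", "ship"], model))
print(trace_fitness(["receive", "ship", "check", "ship"], model))
```

Falling fitness over a stream of traces is one trigger by which a "liquid" model collection would know it has drifted out of sync with actual organizational behaviour.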


This paper introduces the smooth transition logit (STL) model that is designed to detect and model situations in which there is structural change in the behaviour underlying the latent index from which the binary dependent variable is constructed. The maximum likelihood estimators of the parameters of the model are derived along with their asymptotic properties, together with a Lagrange multiplier test of the null hypothesis of linearity in the underlying latent index. The development of the STL model is motivated by the desire to assess the impact of deregulation in the Queensland electricity market and ascertain whether increased competition has resulted in significant changes in the behaviour of the spot price of electricity, specifically with respect to the occurrence of periodic abnormally high prices. The model allows the timing of any change to be endogenously determined and also market participants' behaviour to change gradually over time. The main results provide clear evidence in support of a structural change in the nature of price events, and the endogenously determined timing of the change is consistent with the process of deregulation in Queensland.
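One common way to write such a specification (a generic parameterisation, not necessarily the paper's exact one) lets the latent-index coefficients shift gradually through a logistic transition function in time:

```latex
\Pr(y_{t} = 1 \mid x_{t})
  = \Lambda\!\bigl( x_{t}'\beta_{1} + G(t;\gamma,c)\, x_{t}'\beta_{2} \bigr),
\qquad
G(t;\gamma,c) = \frac{1}{1 + \exp\{-\gamma (t - c)\}},
```

where Λ is the logistic cdf, c locates the endogenously estimated change point, and γ controls how gradually behaviour changes; as γ grows large the model approaches an abrupt structural break, while small γ gives the slow adjustment of market participants described above.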


This paper presents a novel path planning method for minimizing the energy consumption of an autonomous underwater vehicle subject to time-varying ocean disturbances and forecast model uncertainty. The algorithm determines 4-dimensional path candidates using Nonlinear Robust Model Predictive Control (NRMPC), with solutions optimised using A*-like algorithms. Vehicle performance limits are incorporated into the algorithm, with disturbances represented as spatially and temporally varying ocean currents with a bounded uncertainty in their predictions. The proposed algorithm is demonstrated through simulations using a 4-dimensional, spatially distributed, time-series predictive ocean current model. Results show the combined NRMPC and A* approach is capable of generating energy-efficient paths that are resistant to both dynamic disturbances and ocean model uncertainty.
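The A*-like search over an energy cost field can be sketched on a 2-D grid (the paper works in 4-D with an NRMPC layer on top). The energy model here is a deliberately crude invention: moving with the current is cheap, against it expensive. All parameters are illustrative:

```python
import heapq

def astar_energy(grid_w, grid_h, start, goal, current, speed=1.0):
    """A*-like search where each move costs the energy needed to
    overcome the local ocean current (toy cost model)."""
    def energy(a, b):
        dx, dy = b[0] - a[0], b[1] - a[1]
        cx, cy = current(a)                 # current vector at cell a
        against = dx * -cx + dy * -cy       # positive when opposing
        return max(0.1, speed + against)    # energy per unit distance

    def h(n):                               # admissible: min edge cost is 0.1
        return 0.1 * (abs(goal[0] - n[0]) + abs(goal[1] - n[1]))

    open_set = [(h(start), 0.0, start)]
    best = {start: 0.0}
    while open_set:
        f, g, node = heapq.heappop(open_set)
        if node == goal:
            return g
        if g > best.get(node, float("inf")):
            continue                        # stale heap entry
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < grid_w and 0 <= ny < grid_h:
                ng = g + energy(node, (nx, ny))
                if ng < best.get((nx, ny), float("inf")):
                    best[(nx, ny)] = ng
                    heapq.heappush(open_set, (ng + h((nx, ny)), ng, (nx, ny)))
    return float("inf")

# uniform eastward current: riding it east is cheap, fighting it is not
cost = astar_energy(10, 10, (0, 5), (9, 5), lambda n: (0.5, 0.0))
print(cost)
```

Robustness to forecast uncertainty would then be handled, as in the paper, by planning against bounded worst-case deviations of the current field rather than the nominal forecast used here.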


Purpose Performance heterogeneity between collaborative infrastructure projects is typically examined by considering procurement systems and their governance mechanisms at static points in time. The literature neglects to consider the impact of dynamic learning capability, which is thought to reconfigure governance mechanisms over time in response to evolving market conditions. This conceptual paper proposes a new model to show how continuous joint learning of participant organisations improves project performance. Design/methodology/approach There are two stages of conceptual development. In the first stage, the management literature is analysed to explain the Standard Model of dynamic learning capability that emphasises three learning phases for organisations. This Standard Model is extended to derive a novel Circular Model of dynamic learning capability that shows a new feedback loop between performance and learning. In the second stage, the construction management literature is consulted, adding project lifecycle, stakeholder diversity and three organisational levels to the analysis, to arrive at the Collaborative Model of dynamic learning capability. Findings The Collaborative Model should enable construction organisations to successfully adapt and perform under changing market conditions. The complexity of learning cycles results in capabilities that are imperfectly imitable between organisations, explaining performance heterogeneity on projects. Originality/value The Collaborative Model provides a theoretically substantiated description of project performance, driven by the evolution of procurement systems and governance mechanisms. The Model’s empirical value will be tested in future research.


The efficient computation of matrix function vector products has become an important area of research in recent times, driven in particular by two important applications: the numerical solution of fractional partial differential equations and the integration of large systems of ordinary differential equations. In this work we consider a problem that combines these two applications, in the form of a numerical solution algorithm for fractional reaction-diffusion equations that, after spatial discretisation, is advanced in time using the exponential Euler method. We focus on the efficient implementation of the algorithm on Graphics Processing Units (GPU), as we wish to make use of the increased computational power available with this hardware. We compute the matrix function vector products using the contour integration method in [N. Hale, N. Higham, and L. Trefethen. Computing A^α, log(A), and related matrix functions by contour integrals. SIAM J. Numer. Anal., 46(5):2505–2523, 2008]. Multiple levels of preconditioning are applied to reduce the GPU memory footprint and to further accelerate convergence. We also derive an error bound for the convergence of the contour integral method that allows us to pre-determine the appropriate number of quadrature points. Results are presented that demonstrate the effectiveness of the method for large two-dimensional problems, showing a speedup of more than an order of magnitude compared to a CPU-only implementation.
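The contour-integral idea can be sketched in a few lines: f(A)b is the Cauchy integral of f(z)(zI - A)^{-1}b around the spectrum, so a quadrature rule turns it into a handful of independent shifted linear solves (which is exactly what parallelises well on a GPU). This sketch uses a plain circle and dense solves in place of the conformally mapped contours and preconditioned solves of the actual method:

```python
import numpy as np

def matfun_times_vector(f, A, b, centre, radius, n_quad=64):
    """Approximate f(A) @ b by the trapezoidal rule applied to
    f(A)b = (1/(2*pi*i)) * oint f(z) (zI - A)^{-1} b dz
    on a circle of the given centre and radius enclosing the spectrum."""
    n = A.shape[0]
    acc = np.zeros(n, dtype=complex)
    for k in range(n_quad):
        theta = 2.0 * np.pi * k / n_quad
        z = centre + radius * np.exp(1j * theta)
        dz = 1j * radius * np.exp(1j * theta)            # dz/dtheta
        acc += f(z) * np.linalg.solve(z * np.eye(n) - A, b) * dz
    # (1/(2*pi*i)) * (2*pi/n_quad) * sum  ->  sum / (1j * n_quad)
    return (acc / (1j * n_quad)).real

# small symmetric test matrix whose spectrum sits safely inside the contour
A = np.array([[6.0, 1.0, 0.0],
              [1.0, 5.0, 1.0],
              [0.0, 1.0, 7.0]])
b = np.array([1.0, 2.0, 3.0])
w, V = np.linalg.eigh(A)
exact = V @ (np.sqrt(w) * (V.T @ b))      # A^(1/2) b via eigendecomposition
approx = matfun_times_vector(np.sqrt, A, b, centre=6.0, radius=4.0)
print(np.linalg.norm(approx - exact))
```

The trapezoidal rule converges geometrically for such analytic integrands, which is why a pre-determined, modest number of quadrature points suffices, as the error bound in the paper formalises.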


Research on attrition has focused on the economic significance of low graduation rates in terms of costs to students (fees that do not culminate in a credential) and impact on future income. For a student who fails a unit and repeats it multiple times, the financial impact is significant and lasting (Bexley, Daroesman, Arkoudis & James 2013). There are obvious advantages to the timely completion of a degree, both for the student and for the institution. Advantages to students include fee minimisation, enhanced engagement opportunities, an effectual pathway to employment, and a sense of worth, morale and cohort-identity benefits. Work undertaken by the QUT Analytics Project in 2013 and 2014 explored student engagement patterns, capturing a variety of data sources and, specifically, the use of the LMS by students in 804 undergraduate units in one semester. Units with high failure rates were given further attention, and it was found that students who were repeating a unit were less likely to pass it than students attempting it for the first time. In this repeating cohort, academic and behavioural variables were consistently more significant in the modelling than any demographic variables, indicating that a student's performance at university is affected far more by what they do once they arrive than by where they come from. The aim of this poster session is to examine the findings and commonalities of a number of case studies that articulated the engagement activities of repeating students (including data collated from Individual Unit Reports, academic and peer advising programs, and engagement with virtual learning resources). Understanding the profile of the repeating student cohort is therefore as important as considering the characteristics of successful students, so that the institution might be better placed to target repeating students and make proactive interventions as early as possible.


The mining industry presents a number of ideal applications for sensor-based machine control because of the unstructured environment that exists within each mine. The aim of the research presented here is to increase the productivity of existing large compliant mining machines by retrofitting them with enhanced sensing and control technology. The current research focuses on the automatic control of the swing motion cycle of a dragline and on an automated roof bolting system. We have achieved:

* closed-loop swing control of a one-tenth-scale model dragline;
* single-degree-of-freedom closed-loop visual control of an electro-hydraulic manipulator in the lab, developed from standard components.
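A closed-loop controller of the kind used for the swing cycle can be sketched as a discrete PID loop driving a toy first-order plant. The plant model, gains and sample rate below are purely illustrative and bear no relation to the actual dragline dynamics:

```python
def pid_step(error, state, kp=2.0, ki=1.0, kd=0.5, dt=0.05):
    """One update of a textbook PID controller (illustrative gains only)."""
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

# drive a toy first-order swing-angle model towards a 30 degree setpoint
angle, state = 0.0, (0.0, 0.0)
for _ in range(400):                      # 20 s at 20 Hz
    u, state = pid_step(30.0 - angle, state)
    angle += (u - 0.5 * angle) * 0.05     # toy plant dynamics
print(round(angle, 2))
```

Real dragline swing control must additionally handle bucket-rope pendulum dynamics and actuator limits, which is what makes the retrofit problem described above non-trivial.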