361 results for Tilted-time window model
An external field prior for the hidden Potts model with application to cone-beam computed tomography
Abstract:
In images with low contrast-to-noise ratio (CNR), the information gain from the observed pixel values can be insufficient to distinguish foreground objects. A Bayesian approach to this problem is to incorporate prior information about the objects into a statistical model. A method for representing spatial prior information as an external field in a hidden Potts model is introduced. This prior distribution over the latent pixel labels is a mixture of Gaussian fields, centred on the positions of the objects at a previous point in time. It is particularly applicable in longitudinal imaging studies, where the manual segmentation of one image can be used as a prior for automatic segmentation of subsequent images. The method is demonstrated by application to cone-beam computed tomography (CT), an imaging modality that exhibits distortions in pixel values due to X-ray scatter. The external field prior results in a substantial improvement in segmentation accuracy, reducing the mean pixel misclassification rate for an electron density phantom from 87% to 6%. The method is also applied to radiotherapy patient data, demonstrating how to derive the external field prior in a clinical context.
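A minimal sketch of the form such a prior can take, assuming standard hidden Potts notation (the symbols alpha, beta, delta and the Gaussian-mixture construction below are illustrative, not taken from the paper):

\[
\pi(\mathbf{z} \mid \alpha, \beta) \propto \exp\Big\{ \sum_{i} \alpha_{i,z_i} + \beta \sum_{i \sim j} \delta(z_i, z_j) \Big\},
\qquad
\alpha_{i,k} = \log \sum_{m} w_{k,m}\, \mathcal{N}\big(\mathbf{x}_i \mid \boldsymbol{\mu}_{k,m}, \boldsymbol{\Sigma}_{k,m}\big),
\]

where z_i is the latent label of pixel i, the second sum runs over neighbouring pixel pairs, x_i is the spatial coordinate of pixel i, and the Gaussian components are centred on the positions mu_{k,m} of the class-k objects in the earlier, manually segmented image.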
Abstract:
Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on an empirical stochastic model that may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single-receiver data stream. The methods involve three steps: forming a linear combination, handling the ionosphere and ambiguity biases, and estimating the variance. Two distinct approaches are applied to overcome the ionosphere and ambiguity biases: a time-differenced method and a polynomial prediction method. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed method requires only single-receiver observations and is therefore applicable to both differenced and un-differenced data processing modes. However, the methods may be limited to normal ionospheric conditions and to GNSS receivers with low noise autocorrelation. Experimental results also indicate that the proposed method can yield more realistic parameter precision.
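As a rough illustration of the time-differenced idea (a generic sketch, not the authors' algorithm; the function below and its white-noise assumption are hypothetical): differencing consecutive epochs largely cancels slowly varying biases such as ionospheric delay and the carrier-phase ambiguity, so the variance of the differenced series is dominated by observation noise.

```python
import numpy as np

def noise_variance_time_differenced(obs):
    """Estimate observation noise variance from a single-receiver time series.

    Differencing consecutive epochs removes slowly varying biases
    (ionosphere, ambiguity), leaving mostly noise; for white noise,
    var(diff) is approximately 2 * sigma^2.
    """
    d = np.diff(np.asarray(obs, dtype=float))  # epoch-to-epoch differences
    return 0.5 * np.var(d, ddof=1)             # undo the factor of 2 from differencing
```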
Abstract:
The fractional Fokker-Planck equation is an important physical model for simulating anomalous diffusion with external forces. Because of the non-local property of the fractional derivative, an interesting problem is to explore high-accuracy numerical methods for fractional differential equations. In this paper, a space-time spectral method is presented for the numerical solution of the time fractional Fokker-Planck initial-boundary value problem. The proposed method employs Jacobi polynomials for the temporal discretization and Fourier-like basis functions for the spatial discretization. Because the Fourier-like basis functions are diagonalizable, the inner product in the Galerkin analysis admits a reduced representation. We prove that, with the present method, the time fractional Fokker-Planck equation attains the same approximation order as the time fractional diffusion equation treated in [23]. This indicates that exponential decay of the error may be achieved if the exact solution is sufficiently smooth. Finally, some numerical results are given to demonstrate the high-order accuracy and efficiency of the new scheme. The results show that the errors of the numerical solutions obtained by the space-time spectral method decay exponentially.
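For reference, one commonly used form of the time fractional Fokker-Planck equation (the paper's exact formulation may differ):

\[
\frac{\partial u(x,t)}{\partial t}
= {}_{0}D_{t}^{1-\alpha}\!\left[\frac{\partial}{\partial x}\,\frac{V'(x)}{\eta_{\alpha}}
+ K_{\alpha}\frac{\partial^{2}}{\partial x^{2}}\right] u(x,t),
\qquad 0<\alpha<1,
\]

where {}_{0}D_{t}^{1-\alpha} denotes the Riemann-Liouville fractional derivative of order 1-\alpha, V(x) is the external potential, and K_\alpha is the generalized diffusion coefficient.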
Abstract:
Recent changes in the aviation industry and in the expectations of travellers have begun to alter the way we approach our understanding, and thus the segmentation, of airport passengers. The key to successful segmentation of any population lies in the selection of the criteria on which the partitions are based. Increasingly, the basic criteria used to segment passengers (purpose of trip and frequency of travel) no longer provide adequate insights into the passenger experience. In this paper, we propose a new model for passenger segmentation based on the passengers' core value: time. The results are based on qualitative research conducted in situ at Brisbane International Terminal during 2012-2013. Based on our research, a relationship between time sensitivity and degree of passenger engagement was identified. This relationship was used as the basis for a new passenger segmentation model comprising four segments: Airport Enthusiast (engaged, not time sensitive); Time Filler (not engaged, not time sensitive); Efficiency Lover (not engaged, time sensitive); and Efficient Enthusiast (engaged, time sensitive). The outcomes of this research extend the theoretical knowledge about passenger experience in the terminal environment. These new insights can ultimately be used to optimise the allocation of space for future terminal planning and design.
Abstract:
The development of methods for real-time crash prediction as a function of current or recent traffic and roadway conditions is gaining increasing attention in the literature. Numerous studies have modeled the relationships between traffic characteristics and crash occurrence, and significant progress has been made. Given the accumulated evidence on this topic and the lack of an articulate summary of research status, challenges, and opportunities, there is an urgent need to scientifically review these studies and to synthesize the existing state-of-the-art knowledge. This paper addresses this need by undertaking a systematic literature review to identify current knowledge, challenges, and opportunities, and by conducting a meta-analysis of existing studies to provide a summary impact of traffic characteristics on crash occurrence. Sensitivity analyses were conducted to assess the quality, publication bias, and outlier bias of the various studies, and the time intervals used to measure traffic characteristics were also considered. As a result of this comprehensive and systematic review, issues in study designs, traffic and crash data, and model development and validation are discussed. Outcomes of this study are intended to provide researchers focused on real-time crash prediction with greater insight into the modeling of this important but extremely challenging safety issue.
Abstract:
Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
Abstract:
This paper examines the feasibility of using vertical light pipes to naturally illuminate the central core of a multilevel building not reached by window light. The challenges addressed were finding a method to extract and distribute equal amounts of light at each level and designing collectors to improve the effectiveness of vertical light pipes in delivering low-elevation sunlight to the interior. Extraction was achieved by inserting partially reflecting cones within transparent sections of the pipes at each floor level. Theory was formulated to estimate the partial reflectance necessary to provide equal light extraction at each level. Designs for daylight collectors formed from laser-cut panels tilted above the light pipe were developed, and the benefits and limitations of static collectors, as opposed to collectors that follow the sun's azimuth, were investigated. Performance was assessed with both basic and detailed mathematical simulation and by observations made with a five-level model building under clear sky conditions.
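As a simple illustration of the equal-extraction requirement (an idealized, lossless calculation, not the paper's theory): if the pipe serves N levels and the extractor at level k reflects out a fraction r_k of the flux \Phi reaching it, then requiring the same extracted flux at every level gives

\[
E_k = r_k \prod_{j<k}(1 - r_j)\,\Phi = \frac{\Phi}{N}
\quad\Longrightarrow\quad
r_k = \frac{1}{N-k+1}, \qquad k = 1,\dots,N,
\]

so, for example, a three-level pipe would use reflectances of 1/3, 1/2 and 1 from the first extractor encountered to the last.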
Abstract:
An effective prognostics program will provide ample lead time for maintenance engineers to schedule a repair and to acquire replacement components before catastrophic failures occur. This paper presents a technique for accurate assessment of the remnant life of machines based on a health state probability estimation technique. For a comparative study of the proposed model with the proportional hazards model (PHM), experimental bearing failure data from an accelerated bearing test rig were used. The results show that the proposed prognostic model based on health state probability estimation can provide more accurate predictions than the commonly used PHM in this bearing failure case study.
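A minimal sketch of the underlying idea, assuming discrete health states with estimated occupancy probabilities (the function, its inputs, and the weighting scheme below are illustrative assumptions, not the paper's estimator):

```python
import numpy as np

def expected_remnant_life(state_probs, state_mean_rul):
    """Weight state-conditional remaining lifetimes by health-state probabilities.

    state_probs    : estimated probability of each discrete health state (sums to 1)
    state_mean_rul : mean remaining useful life associated with each state
    """
    p = np.asarray(state_probs, dtype=float)
    rul = np.asarray(state_mean_rul, dtype=float)
    return float(p @ rul)  # expected remnant life under the estimated state distribution
```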
Abstract:
This paper introduces the smooth transition logit (STL) model that is designed to detect and model situations in which there is structural change in the behaviour underlying the latent index from which the binary dependent variable is constructed. The maximum likelihood estimators of the parameters of the model are derived along with their asymptotic properties, together with a Lagrange multiplier test of the null hypothesis of linearity in the underlying latent index. The development of the STL model is motivated by the desire to assess the impact of deregulation in the Queensland electricity market and ascertain whether increased competition has resulted in significant changes in the behaviour of the spot price of electricity, specifically with respect to the occurrence of periodic abnormally high prices. The model allows the timing of any change to be endogenously determined and also market participants' behaviour to change gradually over time. The main results provide clear evidence in support of a structural change in the nature of price events, and the endogenously determined timing of the change is consistent with the process of deregulation in Queensland.
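One common way to write a smooth transition specification of this kind (shown only for orientation; the paper's exact parameterisation may differ): the latent index shifts gradually between two regimes via a logistic transition function of time,

\[
P(y_t = 1 \mid x_t) = \Lambda\!\big(x_t'\beta + G(t;\gamma,c)\,x_t'\delta\big),
\qquad
G(t;\gamma,c) = \frac{1}{1 + \exp\{-\gamma(t-c)\}},
\]

where \Lambda is the logistic cdf, c locates the endogenously estimated timing of the change, \gamma governs how gradually behaviour shifts (with \gamma \to \infty recovering an abrupt break), and \delta = 0 corresponds to the linear latent index that forms the null hypothesis of the Lagrange multiplier test.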
Abstract:
This paper presents a novel path planning method for minimizing the energy consumption of an autonomous underwater vehicle subjected to time-varying ocean disturbances and forecast model uncertainty. The algorithm determines 4-dimensional path candidates using Nonlinear Robust Model Predictive Control (NRMPC), with solutions optimised using A*-like algorithms. Vehicle performance limits are incorporated into the algorithm, with disturbances represented as spatially and temporally varying ocean currents with a bounded uncertainty in their predictions. The proposed algorithm is demonstrated through simulations using a 4-dimensional, spatially distributed time-series predictive ocean current model. Results show the combined NRMPC and A* approach is capable of generating energy-efficient paths that are resistant to both dynamic disturbances and ocean model uncertainty.
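As a rough sketch of how ocean currents can enter an energy-aware search cost (an illustrative edge cost for an A*-style planner, not the paper's NRMPC formulation; the names and the cubic drag-power assumption are hypothetical):

```python
import numpy as np

def segment_energy_cost(p0, p1, transit_time, current, drag_coeff=1.0):
    """Energy to cover the ground displacement p1 - p0 in 'transit_time'
    while the surrounding water moves with velocity 'current'.

    Propulsive power is assumed to scale with the cube of the through-water speed.
    """
    p0, p1, current = (np.asarray(a, dtype=float) for a in (p0, p1, current))
    v_ground = (p1 - p0) / transit_time      # required velocity over the ground
    v_rel = v_ground - current               # velocity relative to the water
    speed = np.linalg.norm(v_rel)
    return drag_coeff * speed**3 * transit_time   # energy = power * time
```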
Abstract:
Purpose: Performance heterogeneity between collaborative infrastructure projects is typically examined by considering procurement systems and their governance mechanisms at static points in time. The literature neglects to consider the impact of dynamic learning capability, which is thought to reconfigure governance mechanisms over time in response to evolving market conditions. This conceptual paper proposes a new model to show how continuous joint learning of participant organisations improves project performance.
Design/methodology/approach: There are two stages of conceptual development. In the first stage, the management literature is analysed to explain the Standard Model of dynamic learning capability that emphasises three learning phases for organisations. This Standard Model is extended to derive a novel Circular Model of dynamic learning capability that shows a new feedback loop between performance and learning. In the second stage, the construction management literature is consulted, adding project lifecycle, stakeholder diversity and three organisational levels to the analysis, to arrive at the Collaborative Model of dynamic learning capability.
Findings: The Collaborative Model should enable construction organisations to successfully adapt and perform under changing market conditions. The complexity of learning cycles results in capabilities that are imperfectly imitable between organisations, explaining performance heterogeneity on projects.
Originality/value: The Collaborative Model provides a theoretically substantiated description of project performance, driven by the evolution of procurement systems and governance mechanisms. The Model's empirical value will be tested in future research.
Abstract:
The mining industry presents us with a number of ideal applications for sensor-based machine control because of the unstructured environment that exists within each mine. The aim of the research presented here is to increase the productivity of existing large compliant mining machines by retrofitting them with enhanced sensing and control technology. The current research focuses on the automatic control of the swing motion cycle of a dragline and an automated roof bolting system. We have achieved:
* closed-loop swing control of a one-tenth scale model dragline;
* single degree of freedom closed-loop visual control of an electro-hydraulic manipulator in the lab, developed from standard components.
Abstract:
This paper documents the longitudinal and reciprocal relations among behavioral sleep problems and emotional and attentional self-regulation in a population sample of 4109 children participating in Growing Up in Australia: The Longitudinal Study of Australian Children (LSAC) – Infant Cohort. Maternal reports of children's sleep problems and self-regulation were collected at five time points from infancy to 8-9 years of age. Longitudinal structural equation modeling supported a developmental cascade model in which sleep problems have a persistent negative effect on emotional regulation, which in turn contributes to ongoing sleep problems and poorer attentional regulation in children over time. The findings suggest that sleep behaviors are a key target for interventions that aim to improve children's self-regulatory capacities.
Abstract:
Electrical impedance tomography is a novel technology capable of quantifying ventilation distribution in the lung in real time during various therapeutic manoeuvres. The technique requires changes to the patient's position to place the electrical impedance tomography electrodes circumferentially around the thorax. The impact of these position changes on the time taken to stabilise the regional distribution of ventilation determined by electrical impedance tomography is unknown. This study aimed to determine the time taken for the regional distribution of ventilation determined by electrical impedance tomography to stabilise after changing position. Eight healthy male volunteers were connected to electrical impedance tomography and a pneumotachometer. After 30 minutes of stabilisation supine, participants were moved into a 60-degree Fowler's position and then returned to supine. Thirty minutes was spent in each position. Concurrent readings of ventilation distribution and tidal volumes were taken every five minutes. A mixed regression model with a random intercept was used to compare the positions and the changes over time. The anterior-posterior distribution stabilised after ten minutes in Fowler's position and ten minutes after returning to supine. Left-right stabilisation was achieved after 15 minutes in both Fowler's position and supine. A minimum of 15 minutes of stabilisation should therefore be allowed for spontaneously breathing individuals when assessing ventilation distribution. This time allows stabilisation to occur in both the anterior-posterior and the left-right direction.
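For orientation, a minimal sketch of the kind of random-intercept model described (the column names, input file, and fixed-effect structure below are hypothetical, not taken from the study):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical layout: one row per reading, with the anterior fraction of
# ventilation, the position, minutes since the position change, and a
# participant identifier used for the random intercept.
df = pd.read_csv("eit_readings.csv")  # columns: subject, position, minutes, anterior_fraction

model = smf.mixedlm("anterior_fraction ~ position * minutes",
                    data=df, groups=df["subject"])
result = model.fit()
print(result.summary())
```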
Abstract:
Christmas has come early for copyright owners in Australia. The film company, Roadshow, the pay television company Foxtel, and Rupert Murdoch's News Corp and News Limited--as well as copyright industries--have been clamoring for new copyright powers and remedies. In the summer break, the Coalition Government has responded to such entreaties from its industry supporters and donors, with a new package of copyright laws and policies. There has been significant debate over the proposals between the odd couple of Attorney-General George Brandis and the Minister for Communications, Malcolm Turnbull. There have been deep, philosophical differences between the two Ministers over the copyright agenda. The Attorney-General George Brandis has supported a model of copyright maximalism, with strong rights and remedies for the copyright empires in film, television, and publishing. He has shown little empathy for the information technology companies of the digital economy. The Attorney-General has been impatient to press ahead with a copyright regime. The Minister for Communications, Malcolm Turnbull, has been somewhat more circumspect, recognizing that there is a need to ensure that copyright laws do not adversely impact upon competition in the digital economy. The final proposal is a somewhat awkward compromise between the discipline-and-punish regime preferred by Brandis, and the responsive regulation model favored by Turnbull. In his new book, Information Doesn't Want to Be Free: Laws for the Internet Age, Cory Doctorow has some sage advice for copyright owners: Things that don't make money: Complaining about piracy. Calling your customers thieves. Treating your customers like thieves. In this context, the push by copyright owners and the Coalition Government to have a copyright crackdown may well be counter-productive to their interests.