107 results for variance change point detection
Abstract:
The potential to sequester atmospheric carbon in agricultural and forest soils to offset greenhouse gas emissions has generated interest in measuring changes in soil carbon resulting from changes in land management. However, inherent spatial variability of soil carbon limits the precision of measurement of changes in soil carbon and hence the ability to detect changes. We analyzed variability of soil carbon by intensively sampling sites under different land management as a step toward developing efficient soil sampling designs. Sites were tilled cropland and a mixed deciduous forest in Tennessee, and old-growth and second-growth coniferous forest in western Washington, USA. Six soil cores within each of three microplots were taken as an initial sample and an additional six cores were taken to simulate resampling. Soil C variability was greater in Washington than in Tennessee, and greater in less disturbed than in more disturbed sites. Using this protocol, our data suggest that differences on the order of 2.0 Mg C ha⁻¹ could be detected by collection and analysis of cores from at least five (tilled) or two (forest) microplots in Tennessee. Greater spatial variability in the forested sites in Washington increased the minimum detectable difference, but these systems, consisting of low-C-content sandy soil with irregularly distributed pockets of organic C in buried logs, are likely to rank among the most spatially heterogeneous of systems. Our results clearly indicate that consistent intra-microplot differences at all sites will enable detection of much more modest changes if the same microplots are resampled.
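To make the notion of a minimum detectable difference concrete, the following sketch computes a paired-design MDD from an assumed standard deviation of microplot-level differences and the number of microplots; the standard deviation, significance level, and power below are hypothetical placeholders, not values taken from the study.

```python
# Minimal sketch: minimum detectable difference (MDD) in soil C between two
# sampling campaigns, assuming a paired design on n microplots and a standard
# deviation s of the microplot-level differences (Mg C / ha).
# All numeric values below are hypothetical placeholders, not study data.
from scipy.stats import t


def minimum_detectable_difference(s, n, alpha=0.05, power=0.80):
    """Two-sided paired t-test MDD for n microplots with SD of differences s."""
    df = n - 1
    t_alpha = t.ppf(1 - alpha / 2, df)   # critical value of the test
    t_beta = t.ppf(power, df)            # quantile giving the desired power
    return (t_alpha + t_beta) * s / n ** 0.5


if __name__ == "__main__":
    for n in (2, 5, 10):
        print(n, round(minimum_detectable_difference(s=1.2, n=n), 2))
```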
Abstract:
The quick detection of abrupt (unknown) parameter changes in an observed hidden Markov model (HMM) is important in several applications. Motivated by the recent application of relative entropy concepts in the robust sequential change detection problem (and the related model selection problem), this paper proposes a sequential algorithm for detecting unknown changes based on a relative-entropy-based HMM parameter estimator. Our proposed approach is able to overcome the lack of knowledge of post-change parameters, and is shown to have similar performance to the popular cumulative sum (CUSUM) algorithm (which requires knowledge of the post-change parameter values) when examined, on both simulated and real data, in a vision-based aircraft manoeuvre detection problem.
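As a point of reference for the relative entropy concepts mentioned above, the following is a minimal sketch of the Kullback-Leibler divergence between two discrete distributions; the example distributions are hypothetical and the sketch does not reproduce the paper's HMM estimator.

```python
# Minimal sketch: Kullback-Leibler divergence (relative entropy) between two
# discrete distributions, the quantity underlying relative-entropy-based
# parameter estimation. The example distributions are hypothetical.
import numpy as np


def kl_divergence(p, q):
    """D(p || q) for probability vectors p, q; assumes q > 0 wherever p > 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))


# Example: pre-change vs. candidate post-change emission distributions of an HMM state.
print(kl_divergence([0.7, 0.2, 0.1], [0.4, 0.4, 0.2]))
```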
Abstract:
The presence of insect pests in grain storages throughout the supply chain is a significant problem for farmers, grain handlers, and distributors worldwide. Insect monitoring and sampling programmes are used in the stored grains industry for the detection and estimation of pest populations. At the low pest densities dictated by economic and commercial requirements, the accuracy of both detection and abundance estimates can be influenced by variations in the spatial structure of pest populations over short distances. Geostatistical analysis of Rhyzopertha dominica populations in two and three dimensions showed that insect numbers were positively correlated over short (0.5 cm) distances, and negatively correlated over longer (>10 cm) distances. At 35 °C, insects were located significantly further from the grain surface than at 25 and 30 °C. Dispersion metrics showed statistically significant aggregation in all cases. The observed heterogeneous spatial distribution of R. dominica may also be influenced by factors such as the site of initial infestation and disturbance during handling. To account for these additional factors, I significantly extended a simulation model that incorporates both pest growth and movement through a typical stored-grain supply chain. By incorporating the effects of abundance, initial infestation site, grain handling, and treatment on pest spatial distribution, I developed a supply chain model incorporating estimates of pest spatial distribution. This was used to examine several scenarios representative of grain movement through a supply chain, and to determine the influence of infestation location and grain disturbance on the sampling intensity required to detect pest infestations at various infestation rates. This study has investigated the effects of temperature, infestation point, and grain handling on the spatial distribution and detection of R. dominica. The proportion of grain infested was found to be dependent upon abundance, initial pest location, and grain handling. Simulation modelling indicated that accounting for these factors when developing sampling strategies for stored grain has the potential to significantly reduce sampling costs while simultaneously improving detection rates, resulting in reduced storage and pest management costs while improving grain quality.
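One simple way to quantify the kind of aggregation reported above is a variance-to-mean dispersion index over sample counts; the sketch below uses hypothetical counts and is not necessarily the metric used in the study.

```python
# Minimal sketch: variance-to-mean ratio as a simple dispersion metric for
# insect counts in grain samples (ratio > 1 suggests aggregation). The counts
# below are hypothetical; the study's own metrics may differ.
import numpy as np


def dispersion_index(counts):
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()


samples = [0, 0, 1, 0, 7, 0, 0, 5, 0, 1]    # hypothetical counts per sample
print(round(dispersion_index(samples), 2))  # > 1 indicates clumped (aggregated) counts
```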
Abstract:
The quick detection of an abrupt unknown change in the conditional distribution of a dependent stochastic process has numerous applications. In this paper, we pose a minimax robust quickest change detection problem for cases where there is uncertainty about the post-change conditional distribution. Our minimax robust formulation is based on the popular Lorden criterion of optimal quickest change detection. Under a condition on the set of possible post-change distributions, we show that the widely known cumulative sum (CUSUM) rule is asymptotically minimax robust under our Lorden minimax robust formulation as a false alarm constraint becomes stricter. We also establish general asymptotic bounds on the detection delay of misspecified CUSUM rules (i.e. CUSUM rules that are designed with post-change distributions that differ from those of the observed sequence). We exploit these bounds to compare the delay performance of asymptotically minimax robust, asymptotically optimal, and other misspecified CUSUM rules. In simulation examples, we illustrate that asymptotically minimax robust CUSUM rules can provide better detection delay performance at greatly reduced computational effort compared to competing generalised likelihood ratio procedures.
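For readers unfamiliar with the CUSUM rule discussed above, the following is a minimal sketch of the standard CUSUM recursion for a mean shift in i.i.d. Gaussian observations; the pre-/post-change means, noise level, and threshold are hypothetical, and a misspecified rule would simply use a wrong post-change mean here.

```python
# Minimal sketch: CUSUM rule for detecting a mean shift in i.i.d. Gaussian data.
# Pre-change N(mu0, sigma^2), assumed post-change N(mu1, sigma^2); a misspecified
# rule uses a mu1 that differs from the true post-change mean. All values are
# hypothetical placeholders.
import numpy as np


def cusum_alarm_time(x, mu0, mu1, sigma, threshold):
    """Return the first index at which the CUSUM statistic crosses the threshold."""
    s = 0.0
    for k, xk in enumerate(x):
        # Log-likelihood ratio increment for N(mu1, sigma^2) vs. N(mu0, sigma^2).
        llr = (mu1 - mu0) * (xk - (mu0 + mu1) / 2.0) / sigma ** 2
        s = max(0.0, s + llr)          # CUSUM recursion
        if s >= threshold:
            return k                   # alarm raised at sample k
    return None                        # no alarm within the observed data


rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.0, 1.0, 200)])
print(cusum_alarm_time(data, mu0=0.0, mu1=1.0, sigma=1.0, threshold=5.0))
```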
Abstract:
We propose the use of optical flow information as a method for detecting and describing changes in the environment, from the perspective of a mobile camera. We analyze the characteristics of the optical flow signal and demonstrate how robust flow vectors can be generated and used for the detection of depth discontinuities and appearance changes at key locations. To successfully achieve this task, a full discussion on camera positioning, distortion compensation, noise filtering, and parameter estimation is presented. We then extract statistical attributes from the flow signal to describe the location of the scene changes. We also employ clustering and the dominant shape of the vectors to increase descriptiveness. Once a database of nodes (where a node is a detected scene change) and their corresponding flow features is created, matching can be performed whenever nodes are encountered, such that topological localization can be achieved. We retrieve the most likely node according to the Mahalanobis and Chi-square distances between the current frame and the database. The results illustrate the applicability of the technique for detecting and describing scene changes in diverse lighting conditions, considering indoor and outdoor environments and different robot platforms.
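The following is a minimal sketch of the kind of retrieval step described above, matching a query flow-feature vector to stored per-node statistics by Mahalanobis distance; the feature dimension, node names, and values are hypothetical.

```python
# Minimal sketch: retrieve the most likely node by Mahalanobis distance between
# the current frame's flow-feature vector and each node's stored statistics.
# Feature dimension, node names, and values are hypothetical.
import numpy as np


def mahalanobis(x, mean, cov):
    d = x - mean
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))


def best_matching_node(query, nodes):
    """nodes: dict of node_id -> (mean vector, covariance matrix)."""
    return min(nodes, key=lambda nid: mahalanobis(query, *nodes[nid]))


nodes = {
    "doorway": (np.array([0.2, 1.1]), np.array([[0.05, 0.0], [0.0, 0.2]])),
    "corner": (np.array([1.5, 0.3]), np.array([[0.3, 0.0], [0.0, 0.05]])),
}
print(best_matching_node(np.array([0.25, 0.9]), nodes))
```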
Abstract:
Background: It is important for nutrition intervention in malnourished patients to be guided by accurate evaluation and detection of small changes in the patient's nutrition status over time. However, the current Subjective Global Assessment (SGA) is not able to detect changes over a short period of time. The aim of the study was to determine whether the 7-point SGA is more time-sensitive to nutrition changes than the conventional SGA. Methods: In this prospective study, 67 adult inpatients assessed as malnourished using both the 7-point SGA and conventional SGA were recruited. Each patient received nutrition intervention and was followed up post-discharge. Patients were reassessed using both tools at 1, 3 and 5 months from baseline assessment. Results: It took a significantly shorter time to see a one-point change using the 7-point SGA compared to the conventional SGA (median: 1 month vs. 3 months, p = 0.002). The likelihood of at least a one-point change was 6.74 times greater for the 7-point SGA compared to the conventional SGA after controlling for age, gender and medical specialties (odds ratio = 6.74, 95% CI 2.88-15.80, p < 0.001). Fifty-six percent of patients who had no change in SGA score had changes detected using the 7-point SGA. The level of agreement was 100% (κ = 1, p < 0.001) between the 7-point SGA and the 3-point SGA, and 83% (κ = 0.726, p < 0.001) between two blinded assessors for the 7-point SGA. Conclusion: The 7-point SGA is more time-sensitive in its response to nutrition changes than the conventional SGA. It can be used to guide nutrition intervention for patients.
Abstract:
Stochastic (or random) processes are inherent to numerous fields of human endeavour including engineering, science, business, and finance. This thesis presents multiple novel methods for quickly detecting and estimating uncertainties in several important classes of stochastic processes. The significance of these novel methods is demonstrated by employing them to detect aircraft manoeuvres in video signals in the important application of autonomous mid-air collision avoidance.
Abstract:
Brisbane Water (BW), a commercialised business arm of Brisbane City Council (BCC), entered into an alliance with a number of organisations from the private sector in order to design, construct, commission and undertake upgrades to three existing wastewater treatment plants located at Sandgate, Oxley Creek, and Wacol in Brisbane. The alliance project is called the Brisbane Water Environmental Alliance (BWEA). This report details the efforts of a team of researchers from the School of Management at Queensland University of Technology to investigate this alliance. This is the second report on this project, and is called Stage 2 of the research. At the time Stage 2 of the research was conducted, the BWEA project was nearing completion, with a further 8 months remaining. The aim of this report is to explore individuals' perceptions of the effectiveness and functioning of the BWEA project in the latter stages of the project. The second aim of this report is to analyse the longitudinal findings of this research project by integrating the findings from Stage 1 and Stage 2 of the project. This long-term analysis of the functioning and effectiveness of the alliance is important because, at the current time, researchers have little knowledge of the group developmental processes that occur in large-scale alliances over time. Stage 2 of this research project has a number of aims, including assessing the performance of the BWEA project from the point of view of a range of stakeholders: the alliance board and alliance management team, alliance staff, and key stakeholders from the client organisation (Brisbane Water). Data were collected using semi-structured interviews with 18 individuals including two board members, one external facilitator, and four staff members from the client organisation. Analysis involved coding the interview transcripts in terms of the major issues that were reported by interviewees.
Abstract:
Building Information Modelling (BIM) is an IT-enabled technology that allows storage, management, sharing, access, update and use of all the data relevant to a project throughout the project life-cycle in the form of a data repository. BIM enables improved inter-disciplinary collaboration across distributed teams, intelligent documentation and information retrieval, greater consistency in building data, better conflict detection and enhanced facilities management. While the technology itself may not be new, and similar approaches have been in use in other sectors such as the aircraft and automobile industries for well over a decade now, the AEC/FM (Architecture, Engineering and Construction/Facilities Management) industry has yet to catch up in its ability to exploit the benefits of the IT revolution. Though the potential benefits of the technology in terms of knowledge sharing, project management, project co-ordination and collaboration are all but obvious, the adoption rate has been rather lethargic, in spite of some well-directed efforts and the availability of supporting commercial tools. Since the technology itself has been well tested over the years in other domains, the plausible causes must be rooted well beyond the explanation of the 'Bell Curve of innovation adoption'. This paper discusses the preliminary findings of an ongoing research project funded by the Cooperative Research Centre for Construction Innovation (CRC-CI) which aims to identify these gaps and come up with specifications and guidelines to enable greater adoption of the BIM approach in practice. A detailed literature review is conducted that looks at some of the similar research reported in recent years. A desktop audit of some of the existing commercial tools that support BIM application has been conducted to identify the technological issues and concerns, and a workshop was organized with industry partners and various players in the AEC industry for needs analysis, expectations and feedback on the possible deterrents and inhibitions surrounding BIM adoption.
Abstract:
Despite changes in surgical techniques, radiotherapy targeting and the apparent earlier detection of cancers, secondary lymphoedema is still a significant problem for about 20-30% of those who receive treatment for cancer, although the incidence and prevalence do seem to be falling. These figures generally relate to the detection of an enlarged limb or other area, but it seems that about 60% of all patients also suffer other problems with how the limb feels, what can or cannot be done with it, and a range of social or psychological issues. Often these 'subjective' changes occur before the objective ones, such as a change in arm volume or circumference. For most of those treated for cancer, lymphoedema does not develop immediately, and, while about 60-70% develop it in the first few years, some do not develop lymphoedema for up to 15 or 20 years. Those who will develop clinically manifest lymphoedema in the future are, for some time, in a latent or hidden phase of lymphoedema. There also seem to be some risk factors that indicate a higher likelihood of lymphoedema post treatment, including oedema at the surgical site, arm dominance, age, skin conditions, and body mass index (BMI).
Abstract:
Innovation Management (IM) in most knowledge-based firms is used on an ad hoc basis, with senior managers using the term to leverage competitive edge without understanding its true meaning or how its robust application impacts organisational performance. There have been attempts in the manufacturing industry to harness the innovative potential of the business and apply it as a point of difference to improve financial and non-financial outcomes. However, further work is required to extrapolate the lessons learnt and introduce incremental and/or radical innovation to knowledge-based firms. An international structural engineering firm has been proactive in exploring and implementing this idea and has forged an alliance with the Queensland University of Technology to start the Innovation Management Program (IMP). The aim was to develop a permanent and sustainable program through which innovation can be woven through the fabric of the organisation. There was an intention to reinforce the firm's vision, reinvigorate ideas, and create new options that help in its realisation. This paper outlines the need for innovation in knowledge-based firms and how this consulting engineering firm reacted to this exigency. The development of the Innovation Management Program, its different themes (and associated projects), and how they integrate to form a holistic model is also discussed. The model is designed around the need to provide professional qualification improvement opportunities for staff, to set up organised, structured and easily accessible knowledge repositories that capture tacit and explicit knowledge, and to implement efficient project management strategies with a view to enhancing client satisfaction. A Delphi-type workshop was used to confirm the themes and projects. Some of the individual projects and their expected outcomes are also discussed. A questionnaire and interviews were used to collect data to select appropriate candidates responsible for leading these projects. Following an in-depth analysis of preliminary research results, some recommendations on the selection process will also be presented.
Abstract:
Surveillance networks are typically monitored by a few people, viewing several monitors displaying the camera feeds. It is then very difficult for a human operator to effectively detect events as they happen. Recently, computer vision research has begun to address ways to automatically process some of this data, to assist human operators. Object tracking, event recognition, crowd analysis and human identification at a distance are being pursued as a means to aid human operators and improve the security of areas such as transport hubs. The task of object tracking is key to the effective use of more advanced technologies. To recognize an event, people and objects must be tracked. Tracking also enhances the performance of tasks such as crowd analysis or human identification. Before an object can be tracked, it must be detected. Motion segmentation techniques, widely employed in tracking systems, produce a binary image in which objects can be located. However, these techniques are prone to errors caused by shadows and lighting changes. Detection routines often fail, either due to erroneous motion caused by noise and lighting effects, or due to the detection routines being unable to split occluded regions into their component objects. Particle filters can be used as a self-contained tracking system, and make it unnecessary for the task of detection to be carried out separately except for an initial (often manual) detection to initialise the filter. Particle filters use one or more extracted features to evaluate the likelihood of an object existing at a given point in each frame. Such systems, however, do not easily allow for multiple objects to be tracked robustly, and do not explicitly maintain the identity of tracked objects. This dissertation investigates improvements to the performance of object tracking algorithms through improved motion segmentation and the use of a particle filter. A novel hybrid motion segmentation/optical flow algorithm, capable of simultaneously extracting multiple layers of foreground and optical flow in surveillance video frames, is proposed. The algorithm is shown to perform well in the presence of adverse lighting conditions, and the optical flow is capable of extracting a moving object. The proposed algorithm is integrated within a tracking system and evaluated using the ETISEO (Évaluation du Traitement et de l'Interprétation de Séquences vidEO - Evaluation for video understanding) database, and a significant improvement in detection and tracking performance is demonstrated when compared to a baseline system. A Scalable Condensation Filter (SCF), a particle filter designed to work within an existing tracking system, is also developed. The creation and deletion of modes and the maintenance of identity are handled by the underlying tracking system, and the tracking system is able to benefit from the improved performance that a particle filter provides in uncertain conditions arising from occlusion and noise. The system is evaluated using the ETISEO database. The dissertation then investigates fusion schemes for multi-spectral tracking systems. Four fusion schemes for combining a thermal and a visual colour modality are evaluated using the OTCBVS (Object Tracking and Classification in and Beyond the Visible Spectrum) database. It is shown that a middle fusion scheme yields the best results and demonstrates a significant improvement in performance when compared to a system using either mode individually.
Findings from the thesis contribute to improving the performance of semi-automated video processing and therefore to improving security in areas under surveillance.
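As background to the particle-filter component described above, the following is a minimal sketch of a single predict/weight/resample cycle for tracking one object position; the motion noise and the stand-in appearance likelihood are hypothetical, and this is not the Scalable Condensation Filter itself.

```python
# Minimal sketch: one predict / weight / resample cycle of a particle filter for
# tracking a single object position in the image plane. The motion noise and
# colour-likelihood stand-in below are hypothetical; this is not the SCF itself.
import numpy as np

rng = np.random.default_rng(1)


def likelihood(positions, observation):
    """Stand-in for an appearance likelihood (e.g. colour-histogram similarity)."""
    d2 = np.sum((positions - observation) ** 2, axis=1)
    return np.exp(-d2 / (2 * 15.0 ** 2))


def particle_filter_step(particles, observation, motion_std=5.0):
    # Predict: diffuse particles with random-walk motion noise.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Weight: evaluate how well each particle explains the observation.
    weights = likelihood(particles, observation)
    weights /= weights.sum()
    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]


particles = rng.uniform(0, 100, size=(500, 2))       # initial detection region
for obs in [np.array([40.0, 60.0]), np.array([45.0, 58.0])]:
    particles = particle_filter_step(particles, obs)
print(particles.mean(axis=0))                        # state estimate
```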
Abstract:
In what follows, I put forward an argument for an analytical method for social science that operates at the level of genre. I argue that generic convergence, generic hybridity, and generic instability provide us with a powerful perspective on changes in political, cultural, and economic relationships, most specifically at the level of institutions. Such a perspective can help us identify the transitional elements, relationships, and trajectories that define the place of our current system in history, thereby grounding our understanding of possible futures. In historically contextualising our present with this method, my concern is to indicate possibilities for the future. Systemic contradictions indicate possibility spaces within which systemic change must and will emerge. We live in a system currently dominated by many fully expressed contradictions, and so in the presence of many possible futures. The contradictions of the current age are expressed most overtly in the public genres of power politics. Contemporary public policy, indeed politics in general, is an excellent focus for any investigation of possible futures, precisely because of its future-oriented function. It is overtly hortatory; it is designed 'to get people to do things' (Muntigl in press: 147). There is no point in trying to get people to do things in the past. Consequently, policy discourse is inherently oriented towards creating some future state of affairs (Graham in press), along with concomitant ways of being, knowing, representing, and acting (Fairclough 2000).
Abstract:
This chapter looks at issues of non-stationarity in determining when a transient has occurred and when it is possible to fit a linear model to a non-linear response. The first issue is associated with the detection of loss of damping of power system modes. When a control device such as an SVC fails, the operator needs to know whether the damping of key power system oscillation modes has deteriorated significantly. This question is posed here as an alarm detection problem rather than an identification problem, in order to achieve fast detection of a change. The second issue concerns when a significant disturbance has occurred and the operator is seeking to characterize the system oscillation. The disturbance is initially large, giving a non-linear response; this then decays and can become smaller than the noise level of normal customer load changes. The difficulty is one of determining when a linear response can be reliably identified between the non-linear phase and the large-noise phase of the signal. The solution proposed in this chapter uses "Time-Frequency" analysis tools to assist the extraction of the linear model.
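To illustrate the kind of time-frequency view such tools provide, the sketch below computes a short-time Fourier spectrogram of a synthetic decaying oscillation buried in noise and tracks the energy in a band around the mode; the signal, frequencies, and thresholds are hypothetical placeholders, not the chapter's specific method.

```python
# Minimal sketch: short-time Fourier (spectrogram) view of a decaying power-system
# oscillation buried in load noise, one simple "time-frequency" tool for judging
# when a linear ring-down can still be identified. The signal and all parameters
# below are synthetic and hypothetical.
import numpy as np
from scipy.signal import spectrogram

fs = 50.0                                   # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
mode = np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.7 * t)            # decaying 0.7 Hz mode
noise = 0.05 * np.random.default_rng(0).standard_normal(t.size)  # load-change noise
signal = mode + noise

f, times, Sxx = spectrogram(signal, fs=fs, nperseg=512, noverlap=384)
band = (f > 0.5) & (f < 1.0)                # energy in a band around the 0.7 Hz mode
mode_energy = Sxx[band].sum(axis=0)
# First time at which the modal energy has dropped to below 10% of its peak,
# i.e. roughly where the ring-down fades into the noise floor.
print(times[mode_energy < 0.1 * mode_energy.max()][:1])
```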