163 results for Model-based bootstrap
Abstract:
Process models in organizational collections are typically created by the same team and using the same conventions. As such, these models share many characteristic features, such as size range and the type and frequency of errors. In most cases, only small samples of these collections are available, e.g. because of the sensitive information they contain. Because of their size, these samples may not accurately represent the characteristics of the originating collection. This paper deals with the problem of constructing collections of process models, in the form of Petri nets, from small samples of a collection, in order to accurately estimate the characteristics of that collection. Given a small sample of process models drawn from a real-life collection, we mine a set of generation parameters that we use to generate arbitrarily large collections featuring the same characteristics as the original collection. In this way we can estimate the characteristics of the original collection from the generated collections. We extensively evaluate the quality of our technique on various sample datasets drawn from both research and industry.
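The generate-then-estimate idea can be illustrated with a deliberately simplified sketch (not the paper's actual mining technique): model size stands in for the full set of characteristics, mean and spread are the only "generation parameters" mined, and the normal distribution is an illustrative assumption.

```python
import random
import statistics

def mine_parameters(sample_sizes):
    """Mine generation parameters (here, just mean and spread of model size)."""
    return statistics.mean(sample_sizes), statistics.stdev(sample_sizes)

def generate_collection(mean, stdev, n, rng):
    """Generate an arbitrarily large synthetic collection of model sizes."""
    return [max(1, round(rng.gauss(mean, stdev))) for _ in range(n)]

rng = random.Random(0)
sample = [12, 15, 9, 22, 17, 14, 11, 19]       # sizes of the sampled models
mu, sigma = mine_parameters(sample)
synthetic = generate_collection(mu, sigma, 10_000, rng)
estimate = statistics.mean(synthetic)          # characteristic estimated on
                                               # the generated collection
```

The estimate computed on the large synthetic collection tracks the mined parameter far more stably than a statistic taken directly from the eight-model sample would.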
Abstract:
Numerous econometric models have been proposed for forecasting property market performance, but limited success has been achieved in finding a reliable and consistent model to predict property market movements over a five- to ten-year timeframe. This research focuses on office rental growth forecasts and reviews many of the office rent models that have evolved over the past 20 years. A model by DiPasquale and Wheaton is selected for testing in the Brisbane, Australia office market. The adaptation of this study did not yield explanatory variables that could assist in developing a reliable, predictive model of office rental growth. In light of this result, the paper suggests a system dynamics framework that includes an econometric model based on historical data as well as user-input guidance for the primary variables. The rent forecast outputs would be assessed with regard to market expectations, and probability profiling would be undertaken for use in simulation exercises. The paper concludes with ideas for ongoing research.
Abstract:
This paper presents a group maintenance scheduling case study for a water distribution network. This pipeline network presents the challenge of maintaining aging pipelines with the associated increases in annual maintenance costs. The case study focuses on developing an effective maintenance plan for the water utility. Current replacement planning is difficult, as it needs to balance replacement needs against limited budgets. A Maintenance Grouping Optimization (MGO) model based on a modified genetic algorithm was used to develop an optimal group maintenance schedule over a 20-year cycle. The adjacent geographical distribution of pipelines was used as a grouping criterion to control the search space of the MGO model through a Judgment Matrix. Based on the optimal group maintenance schedule, the total cost was effectively reduced compared with schedules that do not group maintenance jobs. This result can be used as guidance for optimizing the water utility's current maintenance plan.
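As an illustration of the general idea only (not the paper's MGO model or its Judgment Matrix), a toy genetic algorithm can assign each pipeline a replacement year, with jobs scheduled in the same year sharing one mobilisation cost so that grouping is rewarded; all numbers below are invented.

```python
import random

def cost(schedule, setup=10.0, job=1.0):
    """Total cost: one setup per distinct year used, plus a per-job cost."""
    return setup * len(set(schedule)) + job * len(schedule)

def evolve(n_pipes=12, years=20, pop_size=30, generations=200, seed=0):
    """Elitist mutation-only GA over year assignments for each pipeline."""
    rng = random.Random(seed)
    pop = [[rng.randrange(years) for _ in range(n_pipes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]            # keep the cheaper half
        children = []
        for p in parents:
            child = p[:]
            child[rng.randrange(n_pipes)] = rng.randrange(years)  # mutate
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
```

Because grouped years share the setup cost, selection steadily drives the schedule toward fewer, larger maintenance groups.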
Abstract:
Gait recognition approaches continue to struggle with challenges including view invariance, low-resolution data, robustness to unconstrained environments, and fluctuating gait patterns due to subjects carrying goods or wearing different clothes. Although computationally expensive, model-based techniques offer promise over appearance-based techniques for these challenges, as they gather gait features and interpret gait dynamics in skeleton form. In this paper, we propose a fast 3D ellipsoid-based gait recognition algorithm using a 3D voxel model derived from multi-view silhouette images. This approach directly addresses the limitations of view dependency and self-occlusion in existing ellipse-fitting model-based approaches. Voxel models are segmented into four components (left and right legs, above and below the knee), and ellipsoids are fitted to each region using eigenvalue decomposition. Features derived from the ellipsoid parameters are modeled using a Fourier representation to retain the temporal dynamic pattern for classification. We demonstrate the proposed approach on the CMU MoBo database and show that an improvement of 15-20% can be achieved over a 2D ellipse-fitting baseline.
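A minimal sketch of ellipsoid fitting by eigenvalue decomposition, as applied to each voxel-model segment (the point cloud and its scales are invented for illustration):

```python
import numpy as np

def fit_ellipsoid(points):
    """Eigenvectors of the covariance give the ellipsoid axis directions;
    square roots of the eigenvalues give the semi-axis scales."""
    centroid = points.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov((points - centroid).T))
    return centroid, np.sqrt(eigvals), eigvecs   # eigvals in ascending order

# toy segment: a point cloud stretched along z, like a lower-leg segment
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3)) * np.array([0.5, 0.5, 2.0])
centre, axes, orientation = fit_ellipsoid(pts)
```

The centre, semi-axis scales and orientation together parameterise one ellipsoid, and it is such parameters, tracked over a gait cycle, that feed the Fourier representation.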
Abstract:
Signal-degrading speckle is one factor that can reduce the quality of optical coherence tomography images. We demonstrate the use of a hierarchical model-based motion estimation scheme based on an affine motion model to reduce speckle in optical coherence tomography imaging, through image registration and the averaging of multiple B-scans. The proposed technique is evaluated against other methods available in the literature. The results from a set of retinal images show the benefit of the proposed technique, which provides an improvement in signal-to-noise ratio proportional to the square root of the number of averaged images, leading to clearer visual information in the averaged image. The benefits of the proposed technique are also explored in the case of ocular anterior segment imaging.
Abstract:
In this paper we explore the suitability of a recent model-based learning technique, Receding Horizon Locally Weighted Regression (RH-LWR), for learning temporally dependent systems. In particular, this paper investigates the application of RH-LWR to learning control of multiple-input multiple-output robot systems. RH-LWR is demonstrated by learning joint velocity and position control of a three-degree-of-freedom (DoF) rigid-body robot.
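For reference, a single prediction of plain locally weighted regression, the building block of RH-LWR with the receding-horizon machinery omitted, might be sketched as follows (names and data are illustrative):

```python
import numpy as np

def lwr_predict(X, y, x_query, tau=0.5):
    """Weight training points by a Gaussian kernel centred on the query,
    then solve weighted least squares for a local affine model."""
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * tau ** 2))
    Xb = np.hstack([X, np.ones((len(X), 1))])   # affine basis [x, 1]
    W = np.diag(w)
    theta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)
    return float(np.append(x_query, 1.0) @ theta)

X = np.linspace(0.0, 1.0, 10).reshape(-1, 1)   # toy 1-D joint state
y = 2.0 * X.ravel()                            # linear target, e.g. velocity
pred = lwr_predict(X, y, np.array([0.5]))
```

A fresh local model is fitted at every query point, which is what makes the approach suitable for nonlinear robot dynamics.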
Abstract:
There is a need for decision support tools that integrate energy simulation into early design in the context of Australian practice. Despite the proliferation of simulation programs in the last decade, there are no ready-to-use applications that cater specifically for the Australian climate and regulations. Furthermore, the majority of existing tools focus on achieving interaction with the design domain through model-based interoperability, and largely overlook the issue of process integration. This paper proposes an energy-oriented design environment that both accommodates the Australian context and provides interactive and iterative information exchanges that facilitate feedback between domains. It then presents the structure for DEEPA, an openly customisable system that couples parametric modelling and energy simulation software as a means of developing a decision support tool to allow designers to rapidly and flexibly assess the performance of early design alternatives. Finally, it discusses the benefits of developing a dynamic and concurrent performance evaluation process that parallels the characteristics and relationships of the design process.
Abstract:
Significant research has demonstrated direct and indirect associations between substance use and sexual behaviour. Substance use is related to sexual risk-taking and HIV seroconversion among some substance-using men who have sex with men (MSM). It remains unclear what factors mediate or underlie this relationship, and which substances are associated with greater harm. Substance-related expectancies are hypothesised as potential mechanisms. A conceptual model based on social-cognitive theory was tested, exploring the role of demographic factors, substance use, substance-related expectancies and novelty-seeking personality characteristics in predicting unprotected anal intercourse (UAI) while under the influence, across four commonly used substance types. Phase 1, a qualitative study (N = 20), explored how MSM perceive the effects of substance use on their thoughts, feelings and behaviours, including sexual behaviours. Information was obtained through discussion and interviews, from which key themes were established. Results indicated that MSM experience a wide range of reinforcing aspects associated with substance use. General and specific effects were evident across substance types, and were associated with sexual behaviour and sexual risk-taking. Phase 2 consisted of developing a comprehensive profile of substance-related expectancies for MSM (SEP-MSM) regarding alcohol, cannabis, amyl nitrite and stimulants that possessed sound psychometric properties and was appropriate for use among this group. A cross-sectional questionnaire with 249 participants recruited through gay community networks was used to validate these measures, involving online data collection, participant ratings of expectancy items and subsequent factor analysis. Results indicated that expectancies can be reliably assessed and predict substance use patterns.
Phase 3 examined demographic factors, substance use, substance-related expectancies, and novelty-seeking traits among another community sample of MSM (N = 277) throughout Australia, in predicting UAI while under the influence. Using a cross-sectional design, participants were recruited through gay community networks and completed online questionnaires. The SEP-MSM, and associated substance use, predicted UAI. This research extends social-cognitive theory regarding sexual behaviour, and advances understanding of the role of expectancies associated with substance use and sexual risk-taking. Future applications of the SEP-MSM in health promotion, prevention, clinical interventions and research are likely to contribute to reducing harm associated with substance-using MSM (e.g., HIV transmission).
Abstract:
Handling information overload online is, from the user's point of view, a significant challenge, especially as the number of websites grows rapidly with the growth of e-commerce and other related activities. Personalization based on user needs is the key to solving the problem of information overload. Personalization methods help identify relevant information that a user may like. User profiles and object profiles are the important elements of a personalization system. When creating user and object profiles, most existing methods adopt two-dimensional similarity methods based on vector or matrix models to find inter-user and inter-object similarity. Moreover, for recommending similar objects to users, personalization systems use user-user, item-item and user-item similarity measures. In most cases, similarity measures such as Euclidean, Manhattan, cosine and many others based on vector or matrix methods are used to find these similarities. Web logs are high-dimensional datasets, consisting of multiple users and multiple searches, each with many attributes. Two-dimensional data analysis methods may often overlook latent relationships that exist between users and items. In contrast to other studies, this thesis utilises tensors, which are high-dimensional data models, to build user and object profiles and to find the inter-relationships between users and between users and items. To create an improved personalized Web system, this thesis proposes building three types of profiles: individual user profiles, group user profiles and object profiles, utilising the decomposition factors of tensor data models. A hybrid recommendation approach utilising group profiles (forming the basis of a collaborative filtering method) and object profiles (forming the basis of a content-based method) in conjunction with individual user profiles (forming the basis of a model-based approach) is proposed for making effective recommendations.
A tensor-based clustering method is proposed that utilises the outcomes of popular tensor decomposition techniques such as PARAFAC, Tucker and HOSVD to group similar instances. An individual user profile, showing the user's highest interest, is represented by the top dimension values extracted from the component matrix obtained after tensor decomposition. A group profile, showing similar users and their highest interest, is built by clustering similar users based on tensor-decomposed values. A group profile is represented by the top association rules (containing various unique object combinations) derived from the searches made by the users of the cluster. An object profile is created to represent similar objects, clustered on the basis of the similarity of their features. Depending on the category of a user (known, anonymous or a frequent visitor to the website), any of the profiles or their combinations is used for making personalized recommendations. A ranking algorithm is also proposed that utilises the personalized information to order and rank the recommendations. The proposed methodology is evaluated on data collected from a real-life car website. Empirical analysis confirms the effectiveness of the recommendations made by the proposed approach over other collaborative filtering and content-based recommendation approaches based on two-dimensional data analysis methods.
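The first step of HOSVD, extracting a mode-1 (user) factor matrix from the unfolded tensor and then grouping users by their factor loadings, can be sketched on a toy usage tensor (the data and the argmax "clustering" are illustrative stand-ins for the thesis's methods):

```python
import numpy as np

def mode1_factors(tensor, rank):
    """Mode-1 (user) factor matrix via SVD of the mode-1 unfolding --
    the first step of HOSVD."""
    unfolded = tensor.reshape(tensor.shape[0], -1)
    u, _, _ = np.linalg.svd(unfolded, full_matrices=False)
    return u[:, :rank]

# toy usage tensor: 6 users x 4 items x 3 sessions, two clear user groups
# (different interaction strengths keep the singular values distinct)
tensor = np.zeros((6, 4, 3))
tensor[:3, :2, :] = 1.0    # users 0-2 interact with items 0-1
tensor[3:, 2:, :] = 2.0    # users 3-5 interact with items 2-3
factors = mode1_factors(tensor, rank=2)

# crude stand-in for clustering: label each user by its dominant factor
labels = np.argmax(np.abs(factors), axis=1)
```

Each row of the factor matrix is a low-dimensional user representation; clustering those rows is what yields the group profiles described above.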
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability to display and manipulate information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content is carried entirely by developers and play-testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed to enable reasoning about the color and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; we used such a mapping to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques.
Experiments were conducted on a game engine and other virtual worlds prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
Abstract:
This paper calls for a renewed focus on the teaching of writing. It proposes a conceptual model, based on a social realist perspective, which takes account of the ways in which teachers reflexively mediate personal, professional and political considerations in enacting their writing pedagogies. This model extends understanding of the factors contextualising the teaching of writing. It also provides a useful guide for research into the teaching of writing and a prompt for reflexivity in professional development.
Abstract:
The growth of solid tumours beyond a critical size is dependent upon angiogenesis, the formation of new blood vessels from an existing vasculature. Tumours may remain dormant at microscopic sizes for some years before switching to a mode in which growth of a supportive vasculature is initiated. The new blood vessels supply nutrients, oxygen, and access to routes by which tumour cells may travel to other sites within the host (metastasize). In recent decades an abundance of biological research has focused on tumour-induced angiogenesis in the hope that treatments targeted at the vasculature may result in a stabilisation or regression of the disease: a tantalizing prospect. The complex and fascinating process of angiogenesis has also attracted the interest of researchers in the field of mathematical biology, a discipline that is, for mathematics, relatively new. The challenge in mathematical biology is to produce a model that captures the essential elements and critical dependencies of a biological system. Such a model may ultimately be used as a predictive tool. In this thesis we examine a number of aspects of tumour-induced angiogenesis, focusing on growth of the neovasculature external to the tumour. Firstly we present a one-dimensional continuum model of tumour-induced angiogenesis in which elements of the immune system or other tumour-cytotoxins are delivered via the newly formed vessels. This model, based on observations from experiments by Judah Folkman et al., is able to show regression of the tumour for some parameter regimes. The modelling highlights a number of interesting aspects of the process that may be characterised further in the laboratory. The next model we present examines the initiation positions of blood vessel sprouts on an existing vessel, in a two-dimensional domain. 
This model hypothesises that a simple feedback inhibition mechanism may describe the spacing of these sprouts, with the inhibitor being produced by breakdown of the existing vessel's basement membrane. Finally, we have developed a stochastic model of blood vessel growth and anastomosis in three dimensions. The model has been implemented in C++, includes an OpenGL interface, and uses a novel algorithm for calculating the proximity of the line segments representing a growing vessel. This choice of programming language and graphics interface allows for near-simultaneous calculation and visualisation of blood vessel networks on a contemporary personal computer. In addition, the visualised results may be transformed interactively, and drop-down menus facilitate changes to parameter values. Visualisation of results is of vital importance in communicating mathematical information to a wide audience, and we aim to incorporate this philosophy in the thesis. As biological research further uncovers the intriguing processes involved in tumour-induced angiogenesis, we conclude with a comment from mathematical biologist Jim Murray: mathematical biology is "... the most exciting modern application of mathematics".
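The proximity test such a simulation needs, the minimum distance between two line segments representing vessel pieces, can be sketched in Python rather than C++ using the standard closest-point computation (assumes non-degenerate segments; the example segments are invented):

```python
import numpy as np

def segment_distance(p1, q1, p2, q2):
    """Minimum distance between 3-D segments p1-q1 and p2-q2."""
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e = d1 @ d1, d2 @ d2          # squared segment lengths
    b, c, f = d1 @ d2, d1 @ r, d2 @ r
    denom = a * e - b * b            # zero when segments are parallel
    s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > 1e-12 else 0.0
    t = (b * s + f) / e
    if t < 0.0:
        t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
    elif t > 1.0:
        t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    return float(np.linalg.norm((p1 + s * d1) - (p2 + t * d2)))

# two unit-apart parallel vessels, and two crossing segments at height 1
d_parallel = segment_distance(np.array([0.0, 0, 0]), np.array([1.0, 0, 0]),
                              np.array([0.0, 1, 0]), np.array([1.0, 1, 0]))
d_cross = segment_distance(np.array([0.0, 0, 0]), np.array([1.0, 0, 0]),
                           np.array([0.5, -1, 1]), np.array([0.5, 1, 1]))
```

When this distance falls below a vessel-radius threshold, the growing tip can be declared to have met an existing vessel (anastomosis).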
Abstract:
This paper proposes a model-based technique for lowering the entry barrier for service providers to register services with a marketplace broker, such that a service is rapidly configured to utilize the broker's local service delivery management components. Specifically, it uses process modeling to support the execution steps of a service, and shows how service delivery functions (e.g. payment points) "local" to a service broker can be correctly configured into the process model. By formalizing the different operations in a service delivery function (like payment or settlement) and their allowable execution sequences (full payments must follow partial payments), including cross-function dependencies, it shows how, through tool support, a non-technical user can quickly configure service delivery functions in a consistent and complete way.
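A hypothetical sketch of formalizing allowable execution sequences as a small state machine (the operation names and transition table are invented for illustration, not the paper's formalization):

```python
# invented transition table: "full payments must follow partial payments",
# settlement follows full payment, nothing follows settlement
ALLOWED = {
    "start":   {"partial", "full"},
    "partial": {"partial", "full"},   # more partials, or complete the payment
    "full":    {"settle"},            # no partials after a full payment
    "settle":  set(),
}

def is_valid_sequence(ops):
    """Check that a sequence of payment operations is an allowable order."""
    state = "start"
    for op in ops:
        if op not in ALLOWED[state]:
            return False
        state = op
    return state in ("full", "settle")   # a service must end fully paid

ok = is_valid_sequence(["partial", "partial", "full", "settle"])
bad = is_valid_sequence(["full", "partial"])
```

A configuration tool can run exactly this kind of check behind the scenes, so a non-technical user can only wire delivery functions into the process model in orders the formalization permits.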
Abstract:
Increasing resistance of rabbits to myxomatosis in Australia has led to the exploration of Rabbit Haemorrhagic Disease, also called Rabbit Calicivirus Disease (RCD), as a possible control agent. While the initial spread of RCD in Australia resulted in widespread rabbit mortality in affected areas, the possible population-dynamic effects of RCD and myxomatosis operating within the same system have not been properly explored. Here we present early mathematical modelling examining the interaction between the two diseases. In this study we use a deterministic compartment model based on the classical SIR model of infectious disease modelling. We consider only a single strain each of myxomatosis and RCD, and neglect latent periods. We also include logistic population growth with seasonal birth rates. We assume there is no cross-immunity between the diseases. The mathematical model allows for the possibility of both diseases being simultaneously present in an individual, although results are also presented for the case where co-infection is not possible, since co-infection is thought to be rare and questions exist as to whether it can occur; the simulation results of this investigation show that this is a crucial issue and should be part of future field studies. A single simultaneous outbreak of RCD and myxomatosis, such as may be more common in Queensland, was simulated while ignoring natural births and deaths, appropriate for a short timescale of 20 days. For the case where co-infection is not possible, we find that the simultaneous presence of myxomatosis in the population suppresses the prevalence of RCD, compared to an outbreak of RCD with no outbreak of myxomatosis, and thus leads to less effective control of the population. The reason is that infection with myxomatosis removes potentially susceptible rabbits from the possibility of infection with RCD (like a vaccination effect).
We found that, for an initial myxomatosis prevalence of 20%, the reduction in the maximum prevalence of RCD was approximately 30%: the peak RCD prevalence was only 15% when there was a simultaneous outbreak of myxomatosis, compared with the case where there was none. However, this maximum reduction will depend on the other parameter values chosen. When co-infection is allowed, this suppression effect still occurs, but to a lesser degree, because rabbits infected with both diseases reduce the prevalence of myxomatosis. We also simulated multiple outbreaks over a longer timescale of 10 years, including natural population growth with seasonal birth rates and density-dependent (logistic) death rates. This shows how both diseases interact with each other and with population growth. Here we obtain sustained outbreaks occurring approximately every two years for the case of a simultaneous outbreak of both diseases without co-infection, with the prevalence varying from 0.1 to 0.5. Without myxomatosis present, the simulation predicts that RCD dies out quickly without further introduction from elsewhere. With the possibility of simultaneous co-infection of rabbits, sustained outbreaks are still possible, but they are less severe and more frequent (approximately yearly). While further model development is needed, our work to date suggests that: 1) the diseases are likely to interact via their impacts on rabbit abundance levels, and 2) introduction of RCD can suppress myxomatosis prevalence. We recommend that further modelling, in conjunction with field studies, be carried out to investigate further how these two diseases interact in the population.
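The suppression effect can be reproduced qualitatively with a minimal two-disease SIR-style sketch without co-infection (forward Euler over a 20-day outbreak, no births or deaths; all parameter values are illustrative, not the paper's):

```python
def simulate(days=20.0, dt=0.01, im0=0.2, beta_m=0.4, beta_r=0.6, gamma=0.1):
    """Return the peak RCD prevalence and the final (S, Im, Ir, R) state."""
    S, Im, Ir, R = 0.8 - im0, im0, 0.2, 0.0    # fractions of the population
    peak_r = Ir
    for _ in range(int(days / dt)):
        new_m = beta_m * S * Im * dt            # new myxomatosis infections
        new_r = beta_r * S * Ir * dt            # new RCD infections
        rec_m, rec_r = gamma * Im * dt, gamma * Ir * dt
        S -= new_m + new_r
        Im += new_m - rec_m
        Ir += new_r - rec_r
        R += rec_m + rec_r
        peak_r = max(peak_r, Ir)
    return peak_r, (S, Im, Ir, R)

peak_with_myxo, state = simulate(im0=0.2)   # myxomatosis present
peak_without_myxo, _ = simulate(im0=0.0)    # RCD outbreak alone
```

Because the initial myxomatosis cases are drawn out of the susceptible pool, the RCD peak is lower when both diseases circulate, the vaccination-like effect described above.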
Abstract:
Load modelling plays an important role in power system dynamic stability assessment. One of the widely used methods for assessing the impact of load models on system dynamic response is parametric sensitivity analysis. A composite-load-model-based load sensitivity analysis framework is proposed. It enables comprehensive investigation of the impacts of load modelling on system stability, considering the dynamic interactions between load and system dynamics. The effect of the location of individual composite loads, as well as of patches of composite loads in the vicinity, on the sensitivity of the oscillatory modes is investigated. The impact of load composition on the overall sensitivity of the load is also investigated.
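Parametric sensitivity of an oscillatory mode can be sketched as a finite-difference derivative of the least-damped eigenvalue of a system matrix with respect to a load parameter (the two-state matrix below is invented for illustration, not the paper's composite load model):

```python
import numpy as np

def mode_sensitivity(build_A, p, dp=1e-6):
    """Finite-difference sensitivity of the damping (max real part of the
    eigenvalues) of A(p) with respect to the parameter p."""
    damping = lambda A: np.linalg.eigvals(A).real.max()
    return (damping(build_A(p + dp)) - damping(build_A(p))) / dp

# toy two-state oscillator where p adds damping; the least-damped mode has
# real part -p/2, so the analytic sensitivity is -0.5
build_A = lambda p: np.array([[0.0, 1.0], [-1.0, -p]])
sens = mode_sensitivity(build_A, 0.2)
```

A negative sensitivity means increasing the parameter moves the oscillatory mode further into the left half-plane, i.e. improves damping; in the framework above, repeating this for loads at different locations ranks their influence on each mode.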