862 results for MARKOV MODEL
Abstract:
In this paper, we propose a multivariate GARCH model with a time-varying conditional correlation structure. The new double smooth transition conditional correlation (DSTCC) GARCH model extends the smooth transition conditional correlation (STCC) GARCH model of Silvennoinen and Teräsvirta (2005) by including a second variable according to which the correlations change smoothly between states of constant correlations. A Lagrange multiplier test is derived to test the constancy of correlations against the DSTCC-GARCH model, and another to test for an additional transition in the STCC-GARCH framework. In addition, other specification tests, with the aim of aiding the model-building procedure, are considered. Analytical expressions for the test statistics and the required derivatives are provided. Applying the model to stock and bond futures data, we discover that the correlation pattern between them changed dramatically around the turn of the century. The model is also applied to a selection of world stock indices, and we find evidence of an increasing degree of integration in the capital markets.
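The transition mechanism in smooth transition correlation models of this kind can be sketched with a logistic function: the conditional correlation moves smoothly between two constant-correlation states as a transition variable crosses a threshold. The sketch below is illustrative only; the function names and parameter values are not taken from the paper, and the full STCC/DSTCC model embeds this mechanism in a multivariate GARCH structure.

```python
import math

def transition(s, gamma, c):
    """Logistic transition function G(s; gamma, c) in [0, 1].

    gamma > 0 controls how sharply the change between states occurs;
    c locates the transition in the transition variable s.
    """
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))

def conditional_correlation(s, rho_low, rho_high, gamma=5.0, c=0.0):
    """Correlation moves smoothly from rho_low to rho_high as s crosses c."""
    g = transition(s, gamma, c)
    return (1.0 - g) * rho_low + g * rho_high

# Far below the threshold the correlation is near rho_low;
# far above it, near rho_high; at s = c it is exactly the midpoint.
print(conditional_correlation(-10, 0.2, 0.8))
print(conditional_correlation(0, 0.2, 0.8))
print(conditional_correlation(10, 0.2, 0.8))
```

In the double (DSTCC) extension, a second transition variable enters analogously, so the correlation becomes a weighted combination over the states defined by both transition functions.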
Abstract:
This paper explores the potential therapeutic role of the naturally occurring sugar heparan sulfate (HS) for the augmentation of bone repair. Scaffolds comprising fibrin glue loaded with 5 µg of embryonically derived HS were assessed, firstly as a release reservoir, and secondly as a scaffold to stimulate bone regeneration in a critical-size rat cranial defect. We show that HS-loaded scaffolds have a uniform distribution of HS, which was readily released with a typical burst phase, quickly followed by a prolonged delivery lasting several days. Importantly, the released HS contributed to improved wound healing over a 3-month period as determined by microcomputed tomography (µCT) scanning, histology, histomorphometry, and PCR for osteogenic markers. In all cases, only minimal healing was observed after 1 and 3 months in the absence of HS. In contrast, marked healing was observed by 3 months following HS treatment, with nearly full closure of the defect site. PCR analysis showed significant increases in the gene expression of the osteogenic markers Runx2, alkaline phosphatase, and osteopontin in the heparan sulfate group compared with controls. These results further emphasize the important role HS plays in augmenting wound healing, and its successful delivery in a hydrogel provides a novel alternative to autologous bone graft and growth factor-based therapies.
Abstract:
During periods of market stress, electricity prices can rise dramatically. Electricity retailers cannot pass these extreme prices on to customers because of retail price regulation. Improved prediction of these price spikes is therefore important for risk management. This paper builds a time-varying-probability Markov-switching model of Queensland electricity prices, aimed particularly at forecasting price spikes. Variables capturing demand and weather patterns are used to drive the transition probabilities. Unlike traditional Markov-switching models that assume normality of the prices in each state, the model presented here uses a generalised beta distribution to allow for the skewness in the distribution of electricity prices during high-price episodes.
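The core idea of driving transition probabilities with observed covariates can be sketched with a logistic link, in the spirit of time-varying-probability Markov-switching models. The variable names and coefficients below are hypothetical placeholders, not estimates from this paper, and the sketch omits the generalised beta distribution used for prices within the spike state.

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def spike_probability(demand, temperature, b0=-6.0, b_demand=0.05, b_temp=0.08):
    """Probability of moving into the spike regime, driven by covariates.

    The coefficients here are illustrative only.
    """
    return logistic(b0 + b_demand * demand + b_temp * temperature)

def transition_matrix(demand, temperature, p_stay_spike=0.6):
    """Time-t transition matrix for a two-state chain (normal, spike)."""
    p_spike = spike_probability(demand, temperature)
    return [[1.0 - p_spike, p_spike],
            [1.0 - p_stay_spike, p_stay_spike]]

# Mild conditions imply a small spike probability; extreme demand and
# heat push the probability of entering the spike state up.
print(spike_probability(40, 20))
print(spike_probability(90, 40))
```

Because the covariates change over time, so does the transition matrix, which is what distinguishes this class of models from fixed-probability Markov switching.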
Abstract:
As a result of the growing adoption of Business Process Management (BPM) technology different stakeholders need to understand and agree upon the process models that are used to configure BPM systems. However, BPM users have problems dealing with the complexity of such models. Therefore, the challenge is to improve the comprehension of process models. While a substantial amount of literature is devoted to this topic, there is no overview of the various mechanisms that exist to deal with managing complexity in (large) process models. It is thus hard to obtain comparative insight into the degree of support offered for various complexity reducing mechanisms by state-of-the-art languages and tools. This paper focuses on complexity reduction mechanisms that affect the abstract syntax of a process model, i.e. the structure of a process model. These mechanisms are captured as patterns, so that they can be described in their most general form and in a language- and tool-independent manner. The paper concludes with a comparative overview of the degree of support for these patterns offered by state-of-the-art languages and language implementations.
Abstract:
With increasing pressure to provide environmentally responsible infrastructure products and services, stakeholders are placing significant focus on the early identification of the financial viability and outcomes of infrastructure projects. Traditionally, there has been an imbalance between sustainability measures and project budget. On one hand, the industry tends to employ a first-cost mentality and approach to developing infrastructure projects. On the other, environmental experts and technology innovators often push for ultimately green products and systems without much concern for cost. This situation is changing quickly as the industry comes under pressure to continue to return a profit while better adapting to current and emerging global issues of sustainability. For the infrastructure sector to contribute to sustainable development, it will need to increase value and efficiency. Thus, there is a great need for tools that will enable decision makers to evaluate competing initiatives and identify the most sustainable approaches to procuring infrastructure projects. To ensure that these objectives are achieved, life-cycle costing analysis (LCCA) will play a significant role in the economics of an infrastructure project. Recently, a few research initiatives have applied LCCA models to road infrastructure, focusing on the traditional economics of a project. There is little coverage of life-cycle costing as a method to evaluate the criteria and assess the economic implications of pursuing sustainability in road infrastructure projects. To rectify this, this paper reviews the theoretical basis of previous LCCA models before discussing their inability to capture sustainability indicators in road infrastructure projects. It then introduces ongoing research aimed at developing a new model that integrates new cost elements, based on sustainability indicators, with the traditional and proven LCCA approach. It is expected that the research will generate a working model for sustainability-based life-cycle cost analysis.
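The traditional LCCA core that the research builds on is a discounted sum of costs over a project's life. The figures below are purely illustrative: they simply show how a higher first cost can be outweighed by lower recurring costs once whole-of-life discounting is applied, which is exactly the kind of trade-off a sustainability-based LCCA model must expose.

```python
def life_cycle_cost(initial_cost, annual_costs, discount_rate):
    """Present value of an initial outlay plus a stream of annual costs.

    annual_costs[t] is the cost incurred at the end of year t + 1.
    """
    pv = initial_cost
    for t, cost in enumerate(annual_costs, start=1):
        pv += cost / (1.0 + discount_rate) ** t
    return pv

# Hypothetical comparison: a conventional design versus a "greener"
# alternative with a higher first cost but lower operating and
# maintenance costs over a 20-year horizon. Figures are illustrative.
conventional = life_cycle_cost(1_000_000, [80_000] * 20, 0.05)
sustainable = life_cycle_cost(1_200_000, [50_000] * 20, 0.05)
print(round(conventional), round(sustainable))
```

Under these assumed numbers the "greener" option is cheaper on a whole-of-life basis despite its higher first cost, illustrating why a first-cost-only comparison can be misleading.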
Abstract:
The evaluation of satisfaction levels related to performance is an important aspect of increasing market share, improving profitability and enlarging opportunities for repeat business, and can lead to the identification of areas to be improved, more harmonious working relationships and conflict avoidance. In the construction industry, this can also result in improved project quality, enhanced reputation and increased competitiveness. Many conceptual models have been developed to measure satisfaction levels - typically to gauge client satisfaction, customer satisfaction and home buyer satisfaction - but limited empirical research has been carried out, especially in investigating the satisfaction of construction contractors. To address this, this paper provides a unique conceptual model, or framework, for contractor satisfaction based on attributes identified through interviews with practitioners in Malaysia. In addition to progressing research on this topic and being of potential benefit to Malaysian contractors, it is anticipated that the framework will also be useful for other parties - clients, designers, subcontractors and suppliers - in enhancing the quality of products and/or services generally.
Towards a generic skills learning model in public relations: student perspectives on self evaluation
Abstract:
Objectives: To explore whether people's organ donation consent decisions occur via a reasoned and/or social reaction pathway. --------- Design: We examined prospectively students' and community members' decisions to register consent on a donor register and discuss organ donation wishes with family. --------- Method: Participants completed items assessing the theory of planned behaviour (TPB; attitude, subjective norm, perceived behavioural control (PBC)), the prototype/willingness model (PWM; donor prototype favourability/similarity, past behaviour), and proposed additional influences (moral norm, self-identity, recipient prototypes) for registering (N=339) and discussing (N=315) intentions/willingness. Participants self-reported their registering (N=177) and discussing (N=166) behaviour 1 month later. The utility of the (1) TPB, (2) PWM, (3) augmented TPB with PWM, and (4) augmented TPB with PWM and extensions was tested using structural equation modelling for registering and discussing intentions/willingness, and logistic regression for behaviour. --------- Results: While the TPB proved a more parsimonious model, fit indices suggested that the other proposed models offered viable options, explaining greater variance in communication intentions/willingness. The TPB, augmented TPB with PWM, and extended augmented TPB with PWM best explained registering and discussing decisions. The proposed and revised PWM also proved an adequate fit for discussing decisions. Respondents with stronger intentions (and PBC for registering) had a higher likelihood of registering and discussing. --------- Conclusions: People's decisions to communicate donation wishes may be better explained via a reasoned pathway (especially for registering); however, discussing involves more reactive elements. The roles of moral norm, self-identity, and prototypes as influences predicting communication decisions were also highlighted.
Abstract:
An adaptive agent improves its performance by learning from experience. This paper describes an approach to adaptation based on modelling dynamic elements of the environment in order to make predictions of likely future state. This approach is akin to an elite sports player being able to “read the play”, allowing decisions to be made based on predictions of likely future outcomes. Modelling of the agent's likely future state is performed using Markov chains and a technique called “Motion and Occupancy Grids”. The experiments in this paper compare the performance of the planning system with and without the use of this predictive model. The results of the study demonstrate a surprising decrease in performance when using the predictions of agent occupancy. The results are derived from statistical analysis of the agent's performance in a high-fidelity simulation of a world-leading real robot soccer team.
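Prediction via Markov chains over an occupancy grid can be sketched as propagating a probability distribution over grid cells through a transition matrix. The grid, transition probabilities and horizon below are invented for illustration; the paper's “Motion and Occupancy Grids” technique operates on a much richer state space learned from observation.

```python
def predict_occupancy(occupancy, transition, steps):
    """Propagate a discrete occupancy distribution through a Markov chain.

    occupancy[i] is the probability the agent is in cell i now;
    transition[i][j] is the probability of moving from cell i to cell j.
    """
    n = len(occupancy)
    for _ in range(steps):
        occupancy = [sum(occupancy[i] * transition[i][j] for i in range(n))
                     for j in range(n)]
    return occupancy

# Three cells in a row; this hypothetical agent tends to drift rightwards.
transition = [[0.6, 0.4, 0.0],
              [0.1, 0.5, 0.4],
              [0.0, 0.2, 0.8]]
now = [1.0, 0.0, 0.0]
print(predict_occupancy(now, transition, 5))
```

After a few steps most of the probability mass has shifted towards the rightmost cell, which is the kind of "likely future state" a planner could condition its decisions on.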
Abstract:
Background It remains unclear whether it is possible to develop an epidemic forecasting model for the transmission of dengue fever in Queensland, Australia. Objectives To examine the potential impact of the El Niño/Southern Oscillation on the transmission of dengue fever in Queensland, Australia, and to explore the possibility of developing a forecast model for dengue fever. Methods Data on the Southern Oscillation Index (SOI), an indicator of El Niño/Southern Oscillation activity, were obtained from the Australian Bureau of Meteorology. The numbers of dengue fever cases notified and the numbers of postcode areas with dengue fever cases between January 1993 and December 2005 were obtained from Queensland Health, and relevant population data were obtained from the Australian Bureau of Statistics. A multivariate Seasonal Auto-regressive Integrated Moving Average model was developed and validated by dividing the data file into two datasets: the data from January 1993 to December 2003 were used to construct a model and those from January 2004 to December 2005 were used to validate it. Results A decrease in the average SOI (ie, warmer conditions) during the preceding 3–12 months was significantly associated with an increase in the monthly numbers of postcode areas with dengue fever cases (β=−0.038; p=0.019). Predicted values from the Seasonal Auto-regressive Integrated Moving Average model were consistent with the observed values in the validation dataset (root-mean-square percentage error: 1.93%). Conclusions Climate variability is directly and/or indirectly associated with dengue transmission, and the development of an SOI-based epidemic forecasting system is possible for dengue fever in Queensland, Australia.
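The validation metric quoted in the Results, the root-mean-square percentage error, can be computed as follows. This uses one common definition of RMSPE (the root mean square of percentage errors relative to observed values; other definitions exist), and the data are invented for illustration, not the paper's.

```python
import math

def rmspe(observed, predicted):
    """Root-mean-square percentage error.

    Percentage errors are taken relative to the observed values,
    which must therefore be non-zero.
    """
    errors = [100.0 * (o - p) / o for o, p in zip(observed, predicted)]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Illustrative monthly counts of affected postcode areas (not the
# paper's data) against a hypothetical model's predictions.
observed = [10, 12, 15, 9, 11]
predicted = [10.2, 11.8, 14.6, 9.1, 11.3]
print(round(rmspe(observed, predicted), 2))
```

A value near zero indicates predictions closely tracking observations; the paper's reported 1.93% on held-out 2004-2005 data indicates a close fit by this measure.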
Abstract:
This paper explores how a general education can contribute successfully to vocational outcomes, using both Participatory Action Research (PAR) and Program Theory methodology. The paper focuses on the developmental aspects of ‘marrying’ vocational and general education, including engagement processes; student, teacher, institute and employer preparation; and the pathway possibilities that emerge. Successful cases presented include the Healthy Futures program (pathways into the health and allied industries); the Accounting Pathways program (simultaneously studying a general Accounting subject and a Certificate III vocational qualification); and the Sustainable Sciences initiative (development of a vocational qualification that focuses on the emerging renewable energy industry and is linked to school science programs). The case studies have been selected because they are unique in character and application and can be used as a basis for future program development in other settings or curriculum areas.
Abstract:
This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are ‘hybrid’ in nature, in that they are a composition of components whose individual properties may be easily described, but the performance of the model or algorithm as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms. The goal of this thesis is to analyse the computational aspects of hybrid models and hybrid algorithms in the Bayesian context. The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate to both the quality of model fit to data and the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described and the model is compared to the Normal case based on goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of the number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures, which have not been adequately considered in the literature. The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches are considered: a formal comparison of the performance of a range of hybrid algorithms, and a theoretical investigation of the performance of one of these algorithms in high dimensions.
For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis-adjusted Langevin algorithm, and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as a set of examples of hybrid algorithms. The statistical literature often treats statistical efficiency as the only criterion for an efficient algorithm. In this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual algorithms contribute to the overall efficiency of hybrid algorithms, and highlights weaknesses that may be introduced by the process of combining these components into a single algorithm. The second approach to considering computational aspects of hybrid algorithms involves an investigation of the performance of the PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time. In particular, importance sampling-based algorithms, including the PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general sampler, and explores a fundamental problem that occurs in applying importance sampling to a high-dimensional problem. The precision of the computed estimate from the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. Additionally, the exponential growth of the asymptotic variance with the dimension is demonstrated, and we illustrate that the optimal covariance matrix for the importance function can be estimated in a special case.
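The instability of importance sampling in high dimensions, which motivates the thesis's study of the PMC algorithm, can be illustrated with a standard closed-form example (not a derivation from the thesis): for a standard normal target and an isotropic normal proposal, the second moment of the importance weights factorises across dimensions and therefore grows exponentially with dimension whenever the proposal is mismatched.

```python
import math

def weight_second_moment(sigma, d):
    """E_q[w^2] for importance weights w = p/q, with target N(0, I_d)
    and proposal N(0, sigma^2 I_d).

    The one-dimensional factor is sigma / sqrt(2 - 1/sigma^2), finite
    only for sigma^2 > 1/2, and the d-dimensional moment is its d-th power.
    """
    per_dim = sigma / math.sqrt(2.0 - 1.0 / sigma**2)
    return per_dim ** d

# Even a mildly mismatched proposal (sigma = 1.5) sees the weight
# variance, E[w^2] - 1, blow up exponentially with dimension.
for d in (1, 10, 50, 100):
    print(d, weight_second_moment(1.5, d) - 1.0)
```

At sigma = 1 the proposal matches the target and the variance is zero; any mismatch compounds geometrically with dimension, which is why importance-sampling-based samplers such as PMC degrade as the problem dimension grows.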
Abstract:
Existing literature has failed to find robust relationships between individual differences and the ability to fake psychological tests, possibly due to limitations in how successful faking is operationalised. In order to fake, individuals must alter their original profile to create a particular impression. Currently, successful faking is operationalised through statistical definitions, informant ratings, known-groups comparisons, the use of archival and baseline data, and breaches of validity indexes. However, there are many methodological limitations to these approaches. This research proposed a three-component model of successful faking to address this, where an original response is manipulated into a strategic response, which must match a target criterion. Further, by operationalising successful faking in this manner, this research takes into account the fact that individuals may have been successful in reaching their implicitly created profile, but that this may not have matched the criteria they were instructed to fake. Participants (N=48, 22 students and 26 non-students) completed the BDI-II honestly. Participants then faked the BDI-II as if they had no, mild, moderate and severe depression, as well as completing a checklist revealing which symptoms they thought indicated each level of depression. Findings were consistent with a three-component model of successful faking, where individuals effectively changed their profile to what they believed was required; however, this profile differed from the criteria defined by the psychometric norms of the test. One of the foremost issues for research in this area is the inconsistent manner in which successful faking is operationalised. This research allowed successful faking to be operationalised in an objective, quantifiable manner. Using this model as a template may give researchers a better understanding of the processes involved in faking, including the role of strategies and abilities in determining the outcome of test dissimulation.
Abstract:
For some time there has been a growing awareness of organizational culture and its impact on the functioning of engineering and maintenance departments. Those wishing to implement contemporary maintenance regimes (e.g. condition-based maintenance) are often encouraged to develop “appropriate cultures” to support a new method's introduction. Unfortunately, these same publications often fail to articulate the specific cultural values required to support those efforts. In the broader literature, only a limited number of case examples document the cultural values held by engineering asset-intensive firms and how they contribute to their success (or failure). Consequently, a gap exists in our knowledge of what engineering cultures currently look like, or what might constitute a best-practice engineering asset culture. The findings of a pilot study investigating the perceived ideal characteristics of engineering asset cultures are reported. Engineering managers, consultants and academics (n=47) were surveyed as to what they saw as the essential attributes of both engineering cultures and engineering asset personnel. Valued cultural elements included those oriented around continuous improvement, safety and quality. Valued individual attributes included openness to change, interpersonal skills and conscientiousness. The paper concludes with a discussion of the development of a best-practice cultural framework for practitioners and engineering managers.
Abstract:
Background and purpose Our aim was to determine, in an animal model, whether the use of HA paste at the cement-bone interface in the acetabulum would improve fixation. We examined, in sheep, the effect of interposing a layer of hydroxyapatite cement around the periphery of a polyethylene socket prior to fixing it with polymethylmethacrylate (PMMA). Methods We made a randomized study involving 22 sheep to test whether the application of BoneSource hydroxyapatite material to the surface of the ovine acetabulum prior to cementing a polyethylene cup at hip arthroplasty improved the fixation and the nature of the interface. We studied the gross radiographic appearance of the implant-bone interface and the histological appearance at the interface. Results There were more radiolucencies evident in the control group. Histologically, only sheep randomized into the BoneSource group exhibited a fully osseointegrated interface. Use of the hydroxyapatite material did not confer any detrimental effects. In some cases the material appeared to have been fully resorbed. When the material was evident on histological section, it was incorporated into an osseointegrated interface. There was no giant cell reaction present in any case. There was no evidence of migration of BoneSource to the articulation. Interpretation The application of HA material prior to cementation of a socket produced an improved interface. The technique may be useful in man to extend the longevity of the cemented implant by protecting the socket interface from the effects of hydrodynamic fluid flow and particulate debris.