890 results for requirement model
at Queensland University of Technology - ePrints Archive
Abstract:
Business Process Management (BPM) has been identified as the number one business priority by a recent Gartner study (Gartner, 2005). However, BPM has a plethora of facets, as its origins are in Business Process Reengineering, Process Innovation, Process Modelling, and Workflow Management, to name a few. Organisations increasingly recognise the need for a stronger process orientation and require comprehensive frameworks that help to scope and evaluate their BPM initiatives. This research project aims to develop a holistic and widely accepted BPM maturity model that facilitates the assessment of BPM capabilities. This paper provides an overview of the current model, with a focus on the actual model development utilising a series of Delphi studies. The development process includes separate studies that focus on further defining and expanding the six core factors within the model, i.e. strategic alignment, governance, method, Information Technology, people and culture.
Abstract:
With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirements of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of words used in describing the services, as well as the input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individual matched services should fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web Service Description Language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of the query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirements of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pair shortest-path algorithm is applied to find the optimum path at the minimum cost of traversal. The third phase, system integration, integrates the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 Web services found in phase-I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion.
Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
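As a hedged illustration of the link-analysis phase described above, the sketch below models candidate Web services as nodes of a weighted directed graph and runs an all-pair shortest-path (Floyd-Warshall) computation to find a minimum-cost composition. The service names and edge costs are hypothetical; the thesis's actual cost function and fusion algorithm are not reproduced here.

```python
# Illustrative sketch only: all-pair shortest-path over a hypothetical
# service-composition graph (service names and costs are invented).
import math

# Nodes are candidate Web services; an edge (a, b) with cost c means the
# output of service a can feed the input of service b at linking cost c.
services = ["FlightSearch", "HotelSearch", "CurrencyConvert", "Payment"]
edges = {
    ("FlightSearch", "CurrencyConvert"): 1.0,
    ("CurrencyConvert", "Payment"): 0.5,
    ("FlightSearch", "HotelSearch"): 2.0,
    ("HotelSearch", "Payment"): 1.5,
}

# Initialise the distance matrix.
dist = {a: {b: (0.0 if a == b else math.inf) for b in services} for a in services}
for (a, b), cost in edges.items():
    dist[a][b] = cost

# Floyd-Warshall: relax every pair of services through every intermediate node.
for k in services:
    for i in services:
        for j in services:
            if dist[i][k] + dist[k][j] < dist[i][j]:
                dist[i][j] = dist[i][k] + dist[k][j]

# Minimum-cost composition from FlightSearch to Payment.
print(dist["FlightSearch"]["Payment"])  # 1.5, via CurrencyConvert
```

Recording predecessors during relaxation (not shown) would let the recommendation step present the actual chain of services rather than only its cost.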
Abstract:
Minimizing the complexity of group key exchange (GKE) protocols is an important milestone towards their practical deployment. An interesting approach to achieving this goal is to simplify the design of GKE protocols by using generic building blocks. In this paper we investigate the possibility of founding GKE protocols on a primitive called a multi key encapsulation mechanism (mKEM) and describe the advantages and limitations of this approach. In particular, we show how to design a one-round GKE protocol which satisfies the classical requirement of authenticated key exchange (AKE) security, yet without forward secrecy. As a result, we obtain the first one-round GKE protocol secure in the standard model. We also conduct our analysis using recent formal models that take into account both outsider and insider attacks, as well as the notion of key compromise impersonation resilience (KCIR). In contrast to previous models, we show how to model both outsider and insider KCIR within the definition of mutual authentication. Our analysis additionally implies that the insider security compiler by Katz and Shin from ACM CCS 2005 can be used to achieve more than what is shown in the original work, namely both outsider and insider KCIR.
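As a hedged sketch of the mKEM primitive mentioned above, and emphatically not the paper's construction or security analysis, the snippet below builds a naive multi-recipient encapsulation from RSA-OAEP using the Python `cryptography` library: one random session key is encapsulated to every group member's public key. Member count, key sizes and helper names are illustrative, and the sketch provides no authentication or forward secrecy.

```python
# Naive illustrative mKEM: encapsulate one session key to many recipients.
# Toy sketch only; not the mKEM-based GKE protocol analysed in the paper.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def mkem_encaps(public_keys):
    """Return a fresh session key plus one ciphertext per recipient."""
    session_key = os.urandom(32)
    ciphertexts = [pk.encrypt(session_key, OAEP) for pk in public_keys]
    return session_key, ciphertexts

def mkem_decaps(private_key, ciphertext):
    """Recover the session key from the ciphertext addressed to this member."""
    return private_key.decrypt(ciphertext, OAEP)

# Three hypothetical group members.
members = [rsa.generate_private_key(public_exponent=65537, key_size=2048)
           for _ in range(3)]
key, cts = mkem_encaps([m.public_key() for m in members])
assert all(mkem_decaps(m, ct) == key for m, ct in zip(members, cts))
```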
Abstract:
In professions such as teaching, health sciences (medicine, nursing, allied health), and the built environment (engineering), significant work-based learning through practica is an essential element before graduation. However, there is no such requirement in Accountancy. This thesis reports the findings of a qualitative case study of the development and implementation of a Workplace Learning Experience Program in Accountancy at the Queensland University of Technology (QUT) in Australia. The case study of this intervention, based on sociocultural learning theory, provides the grounds for the development of a new model of teaching and learning for accounting education. The survey- and interview-based study documents the responses of two cohorts of university students and a group of employers to a work placement program. The study demonstrates that a 100-hour work placement in Accountancy has elements that enhance student learning. It demonstrates the potential value of the application of sociocultural theories of learning, especially the concept of situated learning involving legitimate peripheral participation (Lave & Wenger, 1991). This research establishes the theoretical base for a paradigm shift for the Accountancy profession to acknowledge work placements prior to graduation as a major element of learning. It is argued that the current model of accounting education requires reform to better align university and workplace learning.
Abstract:
Background
The bisphosphonate zoledronic acid (ZOL) can inhibit osteoclasts, leading to decreased osteoclastogenesis and osteoclast activity in bone. Here, we used a mixed osteolytic/osteoblastic murine model of bone-metastatic prostate cancer, RM1(BM), to determine how inhibiting osteolysis with ZOL affects the ability of these cells to establish metastases in bone, the integrity of the tumour-bearing bones and the survival of the tumour-bearing mice.
Methods
The model involves intracardiac injection for arterial dissemination of the RM1(BM) cells in C57BL/6 mice. ZOL treatment was given via subcutaneous injections on days 0, 4, 8 and 12, at 20 and 100 µg/kg doses. Bone integrity was assessed by micro-computed tomography and histology, with comparison to untreated mice. Osteoclast and osteoblast activity was determined by measuring serum tartrate-resistant acid phosphatase 5b (TRAP 5b) and osteocalcin, respectively. Mice were euthanased according to predetermined criteria, and survival was assessed using Kaplan-Meier plots.
Findings
Micro-CT and histological analysis showed that treatment of mice with ZOL from the day of intracardiac injection of RM1(BM) cells inhibited tumour-induced bone lysis, maintained bone volume and reduced the calcification of tumour-induced endochondral osteoid material. ZOL treatment also led to decreased serum osteocalcin and TRAP 5b levels. Additionally, treated mice showed increased survival compared to vehicle-treated controls. However, ZOL treatment did not inhibit the cells' ability to metastasise to bone, as the number of bone metastases was similar in both treated and untreated mice.
Conclusions
ZOL treatment provided significant benefits for maintaining the integrity of tumour-bearing bones and increased the survival of tumour-bearing mice, though it did not prevent the establishment of bone metastases in this model. From a mechanistic view, these observations confirm that tumour-induced bone lysis is not a requirement for the establishment of these bone tumours.
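For readers unfamiliar with the survival analysis mentioned in the Methods above, the sketch below shows how Kaplan-Meier survival estimates might be produced with the `lifelines` Python library. The survival times, event flags and group labels are purely hypothetical and are not the study's data.

```python
# Hypothetical Kaplan-Meier example (illustrative data only, not study data).
from lifelines import KaplanMeierFitter

# Days until a predefined euthanasia criterion was met (1) or censoring (0).
days_zol      = [28, 32, 35, 40, 41, 45]
event_zol     = [1, 1, 1, 0, 1, 0]
days_vehicle  = [21, 24, 26, 27, 30, 31]
event_vehicle = [1, 1, 1, 1, 1, 1]

kmf_zol = KaplanMeierFitter()
kmf_zol.fit(days_zol, event_observed=event_zol, label="ZOL-treated")
print(kmf_zol.survival_function_)       # stepwise survival probabilities
print(kmf_zol.median_survival_time_)    # median survival, treated group

kmf_vehicle = KaplanMeierFitter()
kmf_vehicle.fit(days_vehicle, event_observed=event_vehicle, label="Vehicle")
print(kmf_vehicle.median_survival_time_)
```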
Abstract:
The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure at a future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative to traditional reliability analysis is to model condition indicators and operating environment indicators and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed. All of these existing covariate-based hazard models were developed based on the underlying theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have, to some extent, been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models neglect to fully utilise three types of asset health information (failure event data (i.e. observed and/or suspended), condition data, and operating environment data) in a single model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data. Condition indicators act as response variables (or dependent variables), whereas operating environment indicators act as explanatory variables (or independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related, and yet more imperative, question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model. This work presents a new approach for addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three sources of asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and condition measurements as well as operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators.
Condition indicators provide information about the health condition of an asset; therefore they update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Some examples of condition indicators are the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators are caused by the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators could be nought in EHM, condition indicators always emerge because they are observed and measured as long as an asset is operational and has survived. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. According to the sample size of failure/suspension times, EHM is extended into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) in the form of the baseline hazard. However, in many industry applications, due to sparse failure event data of assets, the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the restrictive assumption of the semi-parametric EHM of a specified lifetime distribution for failure event histories, the non-parametric EHM, which is a distribution-free model, has been developed. The development of EHM in two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimated results with those of the other existing covariate-based hazard models. The comparison results demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, including a new parameter estimation method for the case of time-dependent covariate effects and missing data, application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
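To make the structure just described concrete, the sketch below shows one plausible reading of a covariate-based hazard in which a Weibull baseline is updated by a condition indicator and scaled by operating environment covariates. The functional forms, parameter values and names are assumptions for illustration, not the thesis's exact EHM formulation or its parameter estimation method.

```python
# Illustrative covariate-based hazard sketch (assumed functional forms,
# not the exact Explicit Hazard Model from the thesis).
import math

def baseline_hazard(t, condition, beta=2.0, eta=1000.0, alpha=0.002):
    """Weibull baseline hazard modified by a condition indicator.

    t         : operating time (e.g. hours)
    condition : condition indicator (e.g. vibration level), assumed here
                to scale the baseline hazard multiplicatively.
    """
    weibull = (beta / eta) * (t / eta) ** (beta - 1.0)
    return weibull * math.exp(alpha * condition)

def hazard(t, condition, environment, gamma=(0.01, 0.05)):
    """Hazard = baseline(t, condition) * exp(gamma . environment covariates)."""
    scale = math.exp(sum(g * z for g, z in zip(gamma, environment)))
    return baseline_hazard(t, condition) * scale

def reliability(t, condition, environment, steps=1000):
    """R(t) = exp(-integral of h(u) du), integrated numerically while holding
    the covariates fixed at their current values (a simplification)."""
    dt = t / steps
    cumulative = sum(hazard((i + 0.5) * dt, condition, environment) * dt
                     for i in range(steps))
    return math.exp(-cumulative)

# Hypothetical asset: 500 h old, moderate vibration, two environment covariates
# (load factor and ambient temperature deviation).
print(hazard(500.0, condition=30.0, environment=(5.0, 2.0)))
print(reliability(500.0, condition=30.0, environment=(5.0, 2.0)))
```

Under this reading, setting the environment covariates to zero leaves the condition-updated baseline unchanged, which mirrors the abstract's point that environment effects can be nought while condition indicators persist for any operational asset.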
Abstract:
Moderation of student assessment is a critical component of teaching and learning in contemporary universities. In Australia, moderation is mandated through university policies and through the new national university accreditation authority, the Tertiary Education Quality and Standards Agency (TEQSA), which began operations in late January 2012 (TEQSA, 2012). The TEQSA requirement to declare details of moderation and any other arrangements used to support consistency and reliability of assessment and grading across each subject in the course of study is a radical step intended to move toward heightened accountability and greater transparency in the tertiary sector, as well as entrenching evidence-based practice in the management of Australian academic programs. In light of this reform, the purpose of this project was to investigate and analyse current moderation practices operating within a faculty of education at a large urban university in Queensland, Australia. This qualitative study involved interviews with the unit coordinators (n=21) and tutors (n=8) of core undergraduate education units and graduate diploma units within the faculty. Four distinct discourses of moderation that academics drew on to discuss their practices were identified in the study. These were: equity, justification, community building, and accountability. These discourses, together with recommendations for changes to moderation practices, are discussed in this paper.
Abstract:
Specialist scholarly books, including monographs, allow researchers to present their work, pose questions, and test and extend areas of theory through long-form writing. Although research communities all over the world value monographs and depend heavily on them as a requirement of tenure and promotion in many disciplines, sales of this kind of book are in free fall, with some estimates suggesting declines of as much as 90% over twenty years (Willinsky 2006). Cash-strapped monograph publishers have found themselves caught in a negative cycle of increasing prices and falling sales, with few resources left to support experimentation, business model innovation or engagement with digital technology and Open Access (OA). This chapter considers an important attempt to tackle failing markets for scholarly monographs, and to enable the wider adoption of OA licenses for book-length works: the 2012-2014 Knowledge Unlatched pilot. Knowledge Unlatched is a bold attempt to reconfigure the market for specialist scholarly books: moving it beyond the sale of ‘content’ towards a model that supports the services valued by scholarly and wider communities in the context of digital possibility. Its success has powerful implications for the way we understand copyright’s role in the creative industries, and for the potential of established institutions and infrastructure to support the open and networked dynamics of a digital age.
Abstract:
Fleck and Johnson (Int. J. Mech. Sci. 29 (1987) 507) and Fleck et al. (Proc. Inst. Mech. Eng. 206 (1992) 119) have developed foil rolling models which allow for large deformations in the roll profile, including the possibility that the rolls flatten completely. However, these models require computationally expensive iterative solution techniques. A new approach to the approximate solution of the Fleck et al. (1992) Influence Function Model has been developed using both analytic and approximation techniques. The numerical difficulties arising from solving an integral equation in the flattened region have been reduced by applying an Inverse Hilbert Transform to obtain an analytic expression for the pressure. The method described in this paper is applicable whether or not a flat region exists.
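For context, the kind of inversion referred to above is the classical finite Hilbert transform (airfoil equation) inversion. One standard statement is sketched below in generic notation, which may differ from the variables, interval and normalisation actually used in the paper.

```latex
% Generic finite Hilbert transform (airfoil equation) inversion, shown for
% illustration; the paper's variables and normalisation may differ.
% Given the singular integral equation for the unknown \varphi on (-1,1),
\[
  \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{-1}^{1}\frac{\varphi(\xi)}{\xi - x}\,d\xi = f(x),
\]
% one standard inversion (Soehngen/Tricomi) is
\[
  \varphi(x) \;=\; -\,\frac{1}{\pi\sqrt{1-x^{2}}}\;
  \mathrm{p.v.}\!\int_{-1}^{1}\frac{\sqrt{1-\xi^{2}}\,f(\xi)}{\xi - x}\,d\xi
  \;+\;\frac{C}{\sqrt{1-x^{2}}},
\]
% where C is an arbitrary constant fixed by an auxiliary condition,
% e.g. boundedness of the solution or a prescribed total load.
```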
Rainfall, Mosquito Density and the Transmission of Ross River Virus: A Time-Series Forecasting Model