491 results for Process model
Abstract:
With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirement of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods to improve the accuracy of Web service discovery to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used to describe the services, as well as the input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individually matched services should then fully satisfy the user's requirements. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web Service Description Language (WSDL) document, the support-based latent semantic kernel is constructed using an innovative concept of binning and merging applied to a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to uncover the hidden meaning of query terms that could not otherwise be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the minimum-cost traversal path. The third phase, system integration, integrates the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of the standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase-I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion.
Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
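The link analysis phase described in the preceding abstract models Web services as nodes of a graph and applies an all-pairs shortest-path algorithm to find the minimum-cost composition. A minimal sketch of that idea, using the classic Floyd-Warshall algorithm with purely hypothetical service names and link costs (the abstract does not specify the cost function or the algorithm variant used), might look like this:

```python
import math

# Hypothetical candidate services returned by the semantic match-making phase.
services = ["FlightSearch", "CurrencyConvert", "HotelSearch", "Payment"]
INF = math.inf

# cost[u][v] = assumed cost of chaining service u's output into service v's input
# (INF means the two services cannot be linked directly).
cost = {u: {v: INF for v in services} for u in services}
for u in services:
    cost[u][u] = 0.0
cost["FlightSearch"]["CurrencyConvert"] = 1.0
cost["CurrencyConvert"]["Payment"] = 2.0
cost["FlightSearch"]["Payment"] = 5.0
cost["HotelSearch"]["Payment"] = 1.5

# Floyd-Warshall: all-pairs minimum-cost composition paths.
for k in services:
    for i in services:
        for j in services:
            if cost[i][k] + cost[k][j] < cost[i][j]:
                cost[i][j] = cost[i][k] + cost[k][j]

# The composite path via CurrencyConvert (cost 3.0) beats the direct link (cost 5.0).
print(cost["FlightSearch"]["Payment"])
```

In the approach described above, the fusion engine then combines such link-analysis results with the semantic similarity scores produced in phase-I.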
Abstract:
Selecting an appropriate business process modelling technique forms an important task within the methodological challenges of a business process management project. While a plethora of techniques has been developed over the last decades, there is an obvious shortage of well-accepted reference frameworks that can be used to evaluate and compare the capabilities of the different techniques. Academic progress has been made at least in the area of representational analyses that use ontology as a benchmark for such evaluations. This paper reflects on comprehensive experiences with the application of a model based on the Bunge ontology in this context. A brief overview of the underlying research model characterizes the different steps in such a research project. A comparative summary of previous representational analyses of process modelling techniques over time gives insights into the relative maturity of selected process modelling techniques. Based on these experiences, suggestions are made as to where ontology-based representational analyses could be further developed and what limitations are inherent to such analyses.
Abstract:
In architecture courses, instilling a wider understanding of the industry-specific representations practised in the Building Industry is normally done under the auspices of Technology and Science subjects. Traditionally, building industry professionals communicated their design intentions using industry-specific representations. Originally these mainly two-dimensional representations such as plans, sections, elevations, schedules, etc. were produced manually, using a drawing board. Currently, this manual process has been digitised in the form of Computer Aided Design and Drafting (CADD) or, ubiquitously, simply CAD. While CAD has significant productivity and accuracy advantages over the earlier manual method, it still only produces industry-specific representations of the design intent. Essentially, CAD is a digital version of the drawing board. The tool used for the production of these representations in industry is still mainly CAD. This is also the approach taken in most traditional university courses and mirrors the reality of the situation in the building industry. A successor to CAD, in the form of Building Information Modelling (BIM), is presently evolving in the Construction Industry. CAD is mostly a technical tool that conforms to existing industry practices. BIM, on the other hand, is revolutionary both as a technical tool and as an industry practice. Rather than producing representations of design intent, BIM produces an exact Virtual Prototype of any building that, in an ideal situation, is centrally stored and freely exchanged within the project team. Essentially, BIM builds any building twice: once in the virtual world, where any faults are resolved, and then in the real world. There is, however, no established model for learning through the use of this technology in Architecture courses. Queensland University of Technology (QUT), a tertiary institution that maintains close links with industry, recognises the importance of equipping its graduates with skills that are relevant to industry. BIM skills are in increasing demand throughout the construction industry as its practices evolve. As such, during the second half of 2008, QUT 4th-year architectural students were formally introduced to BIM for the first time, both as a technology and as an industry practice. This paper will outline the teaching team's experiences and methodologies in offering a BIM unit (Architectural Technology and Science IV) at QUT for the first time and provide a description of the learning model. The paper will present the results of a survey on the learners' perspectives of both BIM and their learning experiences as they learn about and through this technology.
Abstract:
Recent decisions of the Family Court of Australia reflect concerns over the adversarial nature of the legal process. The processes and procedures of the judicial system militate against a detailed examination of the issues and rights of the parties in dispute. The limitations of the family law framework are particularly demonstrated in disputes over the custody of children, where the Court has tended to neglect the rights and interests of the primary carer. An alternative "unified family court" framework will be examined, in which the Court pursues a more active and interventionist approach in the determination of family law disputes.
Abstract:
An earlier CRC-CI project on ‘automatic estimating’ (AE) has shown the key benefit of model-based design methodologies in building design and construction to be the provision of timely quantitative cost evaluations. Furthermore, using AE during design improves design options and results in improved design turn-around times, better design quality and/or lower costs. However, AEs for civil engineering structures do not exist, and research partners in the CRC-CI expressed interest in exploring the development of such a process. This document reports on these investigations. The central objective of the study was to evaluate the benefits and costs of developing an AE for concrete civil engineering works. By studying existing documents and through interviews with design engineers, contractors and estimators, we have established that current civil engineering practices (mainly roads/bridges) do not use model-based planning/design. Drawings are executed in 2D and only completed at the end of lengthy planning/design project management lifecycle stages. We have also determined that estimating plays two important but different roles. The first is part of project management (which we have called macro-level estimating). Estimating in this domain sets project budgets, controls quality delivery and contains costs. The second role is estimating during planning/design (micro-level estimating). The difference between the two roles is that the former is performed at the end of various lifecycle stages, whereas the latter is performed at any suitable time during planning/design.
Abstract:
Principal Topic: Small and micro-enterprises are believed to play a significant part in economic growth and poverty alleviation in developing countries. However, there is a range of issues that arise when looking at the support required for local enterprise development, the role of micro-finance and sustainability. This paper explores the issues associated with the establishment and resourcing of micro-enterprise development and proposes a model of sustainable support of enterprise development in very poor developing economies, particularly in Africa. The purpose of this paper is to identify and address the range of issues raised by the literature and empirical research in Africa regarding micro-finance and small business support, and to develop a model for sustainable support for enterprise development within a particular cultural and economic context. Micro-finance has become big business with a range of models - from those that operate on a strictly business basis to those that come from a philanthropic base. The models used grow from a range of philosophical and cultural perspectives. Entrepreneurship training is provided around the world. Success is often measured by the number involved and the repayment rates - which are very high, largely because of the lending models used. This paper will explore the range of options available and propose a model that can be implemented and evaluated in rapidly changing developing economies. Methodology/Key Propositions: The research draws on entrepreneurial and micro-finance literature and empirical research undertaken in Mozambique, which lies along the Indian Ocean coast of Southern Africa. As a result of war and natural disasters over a prolonged period, there is little industry, primary industries are primitive and there is virtually no infrastructure. Mozambique is ranked as one of the poorest countries in the world. The conditions in Mozambique, though not identical, reflect conditions in many other parts of Africa. A number of key elements in the development of enterprises in poor countries are explored, including: the impact of micro-finance; sustainable models of micro-finance; education and training; capacity building; support mechanisms; the impact on poverty, families and the local economy; survival entrepreneurship versus growth entrepreneurship; and transitions to the formal sector. Results and Implications: The result of this study is the development of a model for providing intellectual and financial resources to micro-entrepreneurs in poor developing countries in a sustainable way. The model provides a base for ongoing research into the process of entrepreneurial growth in African developing economies. The research raises a number of issues regarding sustainability, including the nature of the donor/recipient relationship, access to affordable resources, the impact of individual entrepreneurial activity on the local economy and the need for ongoing research to understand the whole process and its impact, intended and unintended.
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-80s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market shares from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies. Open source-based companies encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms and open source software publishers. However, business model implications are different for each of these categories: the activities of providers of packaged solutions and IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. To date, the literature identifies and depicts only two generic types of business models for open source software publishers: the business model of "bundling" (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an exploratory case study of IdealX, a French open source security software publisher. This case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It aims to depict the process of IdealX's search for the appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of "mutualisation", which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the business model of bundling, the dual licensing business model and the business model of mutualisation) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes beyond the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not only considered as a way of generating income (a "revenue model" (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses the business models from the point of view of these two components.
Abstract:
Designing and estimating civil concrete structures is a complex process which, to many practitioners, is tied to manual or semi-manual 2D design processes and cannot be further improved by automated, interacting design-estimating processes. This paper presents a feasibility study for the development of an automated estimator for concrete bridge design. The study offers a value proposition: an efficient automated model-based estimator can add value to the whole bridge design-estimating process, i.e., reducing estimation errors, shortening the time taken to produce estimates, and increasing the benefit of doing cost estimation compared with current practice. This is followed by a description of what constitutes an efficient automated model-based estimator and how it should be used. Finally, the process of model-based estimating is compared with current practice to highlight the values embedded in the automated processes.
Abstract:
Capital works procurement and its regulatory policy environment within a country can be complex entities. For example, by virtue of Australia’s governmental division between the Commonwealth, states and local jurisdictions and the associated procurement networks and responsibilities at each level, the tendering process is often convoluted. Four inter-related key themes are identified in the literature in relation to procurement disharmony: decentralisation, risk and risk mitigation, free trade and competition, and tendering costs. Through a literature review, policy analysis and consultation with capital works procurement stakeholders, this paper defines and discusses these key areas of conflict, which adversely impact upon the business environments of industry. The aim of this national study is to identify policy differences between jurisdictions in Australia, and to ascertain whether those differences are a barrier to productivity and innovation. This research forms an element of a broader investigation with the aim of developing efficient, effective and nationally harmonised procurement systems. Keywords: capital works, procurement policy reform. Acknowledgement: The research described in this paper was carried out by the Australian Cooperative Research Centre for Construction Innovation.
Abstract:
Maintaining the health of a construction project can help to achieve the desired outcomes of the project. An analogy is drawn to the medical process of a human health check, where it is possible to broadly diagnose health in terms of a number of key areas such as blood pressure or cholesterol level. Similarly, it appears possible to diagnose the current health of a construction project in terms of a number of Critical Success Factors (CSFs) and key performance indicators (KPIs). The medical analogy continues into the detailed investigation phase, where a number of contributing factors are evaluated to identify possible causes of ill health and, through the identification of potential remedies, to return the project to the desired level of health. This paper presents the development of a model that diagnoses the immediate health of a construction project, investigates the factors which appear to be causing the ill health, and proposes a remedy to return the project to good health. The proposed model uses the well-established continuous improvement management model (Deming, 1986) to adapt the process of human physical health checking to construction project health.
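As a purely illustrative sketch of the diagnosis step described above (the KPI names and healthy ranges below are hypothetical examples, not the indicators or thresholds proposed in the paper), a project health check could compare each indicator against a healthy range and flag the ones that warrant detailed investigation:

```python
# Hypothetical KPIs and healthy ranges, mirroring the blood-pressure/cholesterol analogy.
healthy_ranges = {
    "cost_variance_pct": (-5.0, 5.0),          # % over/under budget
    "schedule_variance_pct": (-5.0, 5.0),      # % ahead of/behind schedule
    "rework_rate_pct": (0.0, 3.0),
    "safety_incidents_per_month": (0.0, 1.0),
}

def diagnose(project_kpis):
    """Return the KPIs whose values fall outside their healthy range."""
    unhealthy = {}
    for kpi, value in project_kpis.items():
        low, high = healthy_ranges[kpi]
        if not (low <= value <= high):
            unhealthy[kpi] = value
    return unhealthy

current = {"cost_variance_pct": 8.2, "schedule_variance_pct": -2.0,
           "rework_rate_pct": 4.5, "safety_incidents_per_month": 0.0}

# Flags cost variance and rework rate for detailed investigation.
print(diagnose(current))  # {'cost_variance_pct': 8.2, 'rework_rate_pct': 4.5}
```

In the continuous improvement spirit of the model, the flagged indicators would then be investigated for contributing factors, a remedy applied, and the check repeated.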
Abstract:
This paper proposes a novel relative entropy rate (RER) based approach for multiple HMM (MHMM) approximation of a class of discrete-time uncertain processes. Under different uncertainty assumptions, the model design problem is posed either as a min-max optimisation problem or as a stochastic minimisation problem on the RER between joint laws describing the state and output processes (rather than the more usual RER between output processes). A suitable filter is proposed for which performance results are established that bound conditional mean estimation performance and show that estimation performance improves as the RER is reduced. These filter consistency and convergence bounds are the first results characterising multiple HMM approximation performance and suggest that joint RER concepts provide a useful model selection criterion. The proposed model design process and MHMM filter are demonstrated on an important dim-target detection problem in image processing.
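For context, one standard way of defining a relative entropy rate between two process laws $P$ and $Q$ is as the normalised limit of the Kullback-Leibler divergence between their finite-horizon restrictions; the variant used here applies this to the joint state-and-output laws rather than to the output laws alone (the paper's exact formulation may differ from this sketch):

\[
R(P \,\|\, Q) \;=\; \lim_{k \to \infty} \frac{1}{k}\, D\!\left(P_{k} \,\|\, Q_{k}\right)
\;=\; \lim_{k \to \infty} \frac{1}{k}\, \mathbb{E}_{P}\!\left[\log \frac{\mathrm{d}P_{k}}{\mathrm{d}Q_{k}}\right],
\]

where $P_{k}$ and $Q_{k}$ denote the restrictions of the two joint laws to the first $k$ time steps. The model design problem then selects the MHMM whose joint law minimises this quantity, in the min-max or stochastic sense described above.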
Abstract:
Current policy decision-making in Australia regarding non-health public investments (for example, transport, housing or social welfare programmes) does not quantify health benefits and costs systematically. To address this knowledge gap, this study proposes an economic model for quantifying the health impacts of public policies in dollar terms. The intention is to enable policy-makers to conduct economic evaluations of the health effects of non-health policies and to implement policies that reduce health inequalities as well as enhance the positive health gains of the target population. Health Impact Assessment (HIA) provides an appropriate framework for this study since HIA assesses the beneficial and adverse effects of a programme/policy on public health and on health inequalities through the distribution of those effects. However, HIA usually tries to influence the decision-making process using its scientific findings, mostly epidemiological and toxicological evidence. In reality, this evidence cannot establish causal links between policy and health impacts since it cannot explain how an individual or a community reacts to changing circumstances. The proposed economic model addresses this health-policy linkage using a consumer choice approach that can explain changes in group and individual behaviour in a given economic set-up. The economic model suggested in this paper links epidemiological findings with economic analysis to estimate the health costs and benefits of public investment policies; that is, it estimates the dollar impacts when the health status of the exposed population group is changed by public programmes – for example, transport initiatives to reduce congestion by building new roads, highways or tunnels, or by imposing congestion taxes. For policy evaluation purposes, the model is incorporated in the HIA framework by establishing associations among the identified factors that drive changes in the behaviour of the target population group and, in turn, in health outcomes. The economic variables identified to estimate health inequality and health costs include levels of income, unemployment, education, age group, disadvantaged population groups and mortality/morbidity. However, model validation using case studies and/or available databases from an Australian non-health policy arena (say, transport) remains a future task and is beyond the scope of the current paper.
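As a purely illustrative form of the kind of monetised health impact the proposed model aims to estimate (the symbols and the additive structure below are assumptions made for exposition, not the model developed in the paper), the dollar impact of a non-health policy could be written as

\[
\Delta C \;=\; \sum_{g} N_{g}\, \Delta r_{g}\, c_{g},
\]

where $g$ indexes population groups (for example, by income, age or level of disadvantage), $N_{g}$ is the exposed population in group $g$, $\Delta r_{g}$ is the policy-attributable change in the incidence of the health outcome in that group, and $c_{g}$ is the unit cost per case. Reporting the terms group by group, rather than only the total, also keeps the distribution of impacts, and hence health inequality, visible in the evaluation.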
Abstract:
Currently, well-established clinical therapeutic approaches for bone reconstruction are restricted to the transplantation of autografts and allografts, and the implantation of metal devices or ceramic-based implants to assist bone regeneration. Bone grafts possess osteoconductive and osteoinductive properties; however, they are limited in access and availability and associated with donor site morbidity, haemorrhage, risk of infection, insufficient transplant integration, graft devitalisation, and subsequent resorption resulting in decreased mechanical stability. As a result, recent research focuses on the development of alternative therapeutic concepts. The field of tissue engineering has emerged as an important approach to bone regeneration. However, bench-to-bedside translations are still infrequent, as the process towards approval by regulatory bodies is protracted and costly, requiring comprehensive in vitro and in vivo studies. The resulting gap between research and clinical translation, and hence commercialization, is referred to as the ‘Valley of Death’ and describes the large number of projects and/or ventures that are discontinued due to a lack of funding during the transition from product/technology development to regulatory approval and, subsequently, commercialization. One of the greatest difficulties in bridging the Valley of Death is to develop good manufacturing practice (GMP) processes and scalable designs and to apply these in pre-clinical studies. In this article, we describe part of the rationale and road map of how our multidisciplinary research team has approached the first steps of translating orthopaedic bone engineering from bench to bedside by establishing a pre-clinical ovine critical-sized tibial segmental bone defect model, and we discuss our preliminary data relating to this decisive step.
Abstract:
The increasing prevalence of International New Ventures (INVs) during the past twenty years has been highlighted by numerous studies (Knight and Cavusgil, 1996; Moen, 2002). International New Ventures are firms, typically small to medium enterprises, that internationalise within six years of inception (Oviatt and McDougall, 1997). To date, there has been no general consensus within the literature on a theoretical framework of internationalisation to explain the internationalisation process of INVs (Madsen and Servais, 1997). However, some researchers have suggested that the innovation diffusion model may provide a suitable theoretical framework (Chetty & Hamilton, 1996; Fan & Phan, 2007). The proposed model was based on existing and well-established innovation diffusion theories drawn from the consumer behaviour and internationalisation literature to explain the internationalisation process of INVs (Lim, Sharkey and Kim, 1991; Reid, 1981; Robertson, 1971; Rogers, 1962; Wickramasekera and Oczkowski, 2006). The results of this analysis indicated that the synthesised model of export adoption was effective in explaining the internationalisation process of INVs within the Queensland Food and Beverage Industry. Significantly, the results of the analysis also confirmed features of the original I-models developed in the consumer behaviour literature that had received limited examination within the internationalisation literature. This includes the ability of firms, or specifically decision-makers, to skip stages based on previous experience.
Abstract:
The proliferation of innovative schemes to address climate change at international, national and local levels signals a fundamental shift in the priority and role of the natural environment to society, organizations and individuals. This shift in shared priorities invites academics and practitioners to consider the role of institutions in shaping and constraining responses to climate change at multiple levels of organisations and society. Institutional theory provides an approach to conceptualising and addressing climate change challenges by focusing on the central logics that guide society, organizations and individuals and their material and symbolic relationship to the environment. For example, framing a response to climate change in the form of an emissions trading scheme evidences a practice informed by a capitalist market logic (Friedland and Alford 1991). However, not all responses need necessarily align with a market logic. Indeed, Thornton (2004) identifies six broad societal sectors, each with its own logic (markets, corporations, professions, states, families, religions). Hence, understanding the logics that underpin successful (and unsuccessful) climate change initiatives contributes to revealing how institutions shape and constrain practices, and provides valuable insights for policy makers and organizations. This paper develops models and propositions to consider the construction of, and challenges to, climate change initiatives based on institutional logics (Thornton and Ocasio 2008). We propose that the challenge of understanding and explaining how climate change initiatives are successfully adopted be examined in terms of their institutional logics, and how these logics evolve over time. To achieve this, a multi-level framework of analysis that encompasses society, organizations and individuals is necessary (Friedland and Alford 1991). However, to date most extant studies of institutional logics have tended to emphasize one level over the others (Thornton and Ocasio 2008: 104). In addition, existing studies related to climate change initiatives have largely been descriptive (e.g. Braun 2008) or prescriptive (e.g. Boiral 2006) in terms of the suitability of particular practices. This paper contributes to the literature on logics by examining multiple levels: the proliferation of the climate change agenda provides a site in which to study how institutional logics are played out across multiple yet embedded levels within society, through institutional forums in which change takes place. Secondly, the paper specifically examines how institutional logics provide society with organising principles (material practices and symbolic constructions) which enable and constrain their actions and help define their motives and identity. Based on this model, we develop a series of propositions about the conditions required for the successful introduction of climate change initiatives. The paper proceeds as follows. We present a review of literature related to institutional logics and develop a generic model of the process of the operation of institutional logics. We then consider how this applies to key initiatives related to climate change. Finally, we develop a series of propositions which might guide insights into the successful implementation of climate change practices.