911 results for process model collection


Relevance: 30.00%

Publisher:

Abstract:

Automated Scheduler is a prototype software tool that automatically prepares a construction schedule together with a 4D simulation of the construction process from a 3D CAD building model.
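The abstract does not describe the scheduling algorithm itself. One plausible core step, deriving earliest start and finish times from construction dependencies between model elements, can be sketched as follows (all element names, durations and dependencies are hypothetical, not taken from the tool):

```python
from collections import defaultdict, deque

def derive_schedule(elements, durations, depends_on):
    """Topologically order building elements by their construction
    dependencies and assign earliest start/finish days (Kahn's algorithm)."""
    indegree = {e: 0 for e in elements}
    successors = defaultdict(list)
    for elem, preds in depends_on.items():
        for p in preds:
            successors[p].append(elem)
            indegree[elem] += 1
    ready = deque(e for e in elements if indegree[e] == 0)
    schedule = {}
    while ready:
        e = ready.popleft()
        # An element can start only after all its predecessors finish.
        start = max((schedule[p][1] for p in depends_on.get(e, [])), default=0)
        schedule[e] = (start, start + durations[e])
        for s in successors[e]:
            indegree[s] -= 1
            if indegree[s] == 0:
                ready.append(s)
    return schedule

# Hypothetical elements extracted from a 3D CAD model.
elements = ["foundation", "columns", "slab", "roof"]
durations = {"foundation": 10, "columns": 5, "slab": 7, "roof": 4}
depends_on = {"columns": ["foundation"], "slab": ["columns"], "roof": ["slab"]}
sched = derive_schedule(elements, durations, depends_on)
print(sched)
```

The resulting start/finish pairs are what a 4D simulation would step through, pairing each time window with the corresponding 3D objects.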

Abstract:

A study has been conducted to investigate current practices in decision-making under risk and uncertainty for infrastructure project investments. It was found that many European countries, such as the UK, France and Germany, as well as Australia, use scenarios to investigate the effects of risk and uncertainty on project investments. Alternative scenarios are mostly considered during the engineering economic cost-benefit analysis stage. For instance, the World Bank requires an analysis of risks in all project appraisals; risk in economic evaluation is addressed by calculating the sensitivity of the rate of return for a number of events. Risks and uncertainties in project developments arise from various sources of error, including data, model and forecasting errors. The most influential factors affecting risk and uncertainty were found to result from forecasting errors, while data errors and model errors have trivial effects. Many analysts argue that scenarios do not forecast what will happen; they indicate only what can happen under given alternatives. It was suggested that probability distributions of the end products of project appraisal, such as cost-benefit ratios that take forecasting errors into account, are feasible decision tools for economic evaluation. Political, social, environmental, economic and other related risk issues have been addressed and included in decision-making frameworks, such as multi-criteria decision-making frameworks, but no suggestion has been made on how to incorporate risk into the investment decision-making process.
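The suggested decision tool, a probability distribution of the benefit-cost ratio that accounts for forecasting errors, can be sketched as a small Monte Carlo simulation (all figures and the error model are assumed for illustration, not taken from the study):

```python
import random

def bcr_distribution(base_benefit, base_cost, forecast_error_sd, n=10_000, seed=1):
    """Monte Carlo sketch: apply multiplicative forecasting errors to the
    benefit forecast and return the simulated benefit-cost ratios."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n):
        # Forecasting error modelled as a lognormal multiplicative shock.
        benefit = base_benefit * rng.lognormvariate(0.0, forecast_error_sd)
        ratios.append(benefit / base_cost)
    return ratios

ratios = bcr_distribution(base_benefit=120.0, base_cost=100.0, forecast_error_sd=0.25)
ratios.sort()
p_viable = sum(r > 1.0 for r in ratios) / len(ratios)  # share of runs with BCR > 1
print(round(p_viable, 2), round(ratios[len(ratios) // 2], 2))
```

Rather than a single point estimate of the ratio, the decision-maker sees the probability that the project remains viable once forecasting error is taken into account.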

Abstract:

The ability to assess a commercial building for its impact on the environment at the earliest stage of design is a goal achievable by integrating several approaches into a single procedure driven directly from the 3D CAD representation. Such an approach enables building design professionals to make informed decisions on the environmental impact of a building and its alternatives during the design development stage, instead of at the post-design stage when options become limited. The indicators of interest are those relating to the consumption of resources and energy, contributions to the pollution of air, water and soil, and impacts on the health and wellbeing of people in the built environment as a result of constructing and operating buildings. 3D object-oriented CAD files contain a wealth of building information which can be interrogated for the details required to analyse the performance of a design. The quantities of all components in the building can be obtained automatically from the 3D CAD objects and their constituent materials identified, yielding a complete list of the amounts of all building products such as concrete, steel, timber and plastic. When this information is combined with a life cycle inventory database, key internationally recognised environmental indicators can be estimated. Such a fully integrated tool, known as LCADesign, has been created for automated eco-efficiency assessment of commercial buildings directly from 3D CAD. This paper outlines the key features of LCADesign and its application to the environmental assessment of commercial buildings.
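The calculation the abstract describes, combining quantities taken off from 3D CAD objects with a life cycle inventory database, reduces to a weighted sum per indicator. A minimal sketch with illustrative quantities and emission factors (not real LCI data):

```python
# Hypothetical quantity takeoff from 3D CAD objects, paired with a
# life cycle inventory (LCI) of per-unit environmental factors.
takeoff = {                      # material -> quantity in tonnes
    "concrete": 250.0,
    "steel": 18.0,
    "timber": 6.5,
}
lci = {                          # material -> kg CO2-e per tonne (illustrative)
    "concrete": 110.0,
    "steel": 1800.0,
    "timber": 45.0,
}

def embodied_indicator(takeoff, lci):
    """Combine material quantities with LCI factors into one indicator."""
    return sum(qty * lci[mat] for mat, qty in takeoff.items())

total_co2 = embodied_indicator(takeoff, lci)
print(f"{total_co2:.0f} kg CO2-e")
```

The same pattern repeats for each indicator (water, energy, soil pollution), with one factor table per indicator; the design-stage value of the tool is that `takeoff` comes automatically from the CAD model rather than from manual measurement.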

Abstract:

AIMM stands for 'Agents for Improved Maintenance Management'. The AIMM system is a prototype tool that advances the state of the art in life-cycle modelling of buildings by linking a 3D model with maintenance data, giving both the facility manager and the designer access to building maintenance information and knowledge that is currently inaccessible. AIMM integrates data mining agents into the maintenance process to produce timely data for the facility manager on the effects of different maintenance regimes.
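The kind of linkage AIMM relies on, joining 3D model elements to maintenance records through a shared element identifier so that agents can compare maintenance regimes, can be sketched as follows (all data and field names are hypothetical):

```python
from collections import defaultdict

# Hypothetical link between 3D model elements and maintenance records,
# keyed by a shared element identifier.
model_elements = {
    "door-101": {"type": "door", "location": "level 1"},
    "ahu-03": {"type": "air handler", "location": "roof"},
}
maintenance_log = [
    {"element": "ahu-03", "regime": "preventive", "cost": 400.0},
    {"element": "ahu-03", "regime": "reactive", "cost": 1500.0},
    {"element": "door-101", "regime": "reactive", "cost": 120.0},
]

def cost_by_regime(log):
    """Aggregate maintenance cost per regime, as a mining agent might,
    to expose the cost effect of different maintenance regimes."""
    totals = defaultdict(float)
    for record in log:
        totals[record["regime"]] += record["cost"]
    return dict(totals)

totals = cost_by_regime(maintenance_log)
print(totals)
```

Because every log entry carries the element identifier, the same aggregation can be drilled down per element type or location via `model_elements`, which is the feedback the designer would use.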

Abstract:

This paper proposes a novel relative entropy rate (RER) based approach for multiple HMM (MHMM) approximation of a class of discrete-time uncertain processes. Under different uncertainty assumptions, the model design problem is posed either as a min-max optimisation problem or as a stochastic minimisation problem on the RER between joint laws describing the state and output processes (rather than the more usual RER between output processes). A suitable filter is proposed, for which performance results are established that bound conditional mean estimation performance and show that estimation performance improves as the RER is reduced. These filter consistency and convergence bounds are the first results characterising multiple HMM approximation performance, and they suggest that joint RER concepts provide a useful model selection criterion. The proposed model design process and MHMM filter are demonstrated on an important image processing dim-target detection problem.
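The quantity on which the design problem is posed is, in standard usage, the asymptotic per-sample relative entropy between two process laws. A conventional definition (not reproduced from the paper) for joint laws $P$ and $Q$ of the state process $X$ and output process $Y$ is:

```latex
% Relative entropy rate between the joint laws P and Q of the
% state-output process (X_{1:n}, Y_{1:n}); D is relative entropy.
R(P \,\|\, Q) = \lim_{n \to \infty} \frac{1}{n}\,
  D\!\left( P_{X_{1:n}, Y_{1:n}} \,\middle\|\, Q_{X_{1:n}, Y_{1:n}} \right),
\qquad
D(P \,\|\, Q) = \mathbb{E}_{P}\!\left[ \log \frac{dP}{dQ} \right].
```

Posing the criterion on the joint laws, rather than the output marginals alone, is what lets the paper's bounds speak to conditional mean estimation of the state.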

Abstract:

Current policy decision-making in Australia regarding non-health public investments (for example, transport, housing or social welfare programmes) does not quantify health benefits and costs systematically. To address this knowledge gap, this study proposes an economic model for quantifying the health impacts of public policies in dollar terms. The intention is to enable policy-makers to conduct economic evaluation of the health effects of non-health policies and to implement policies that reduce health inequalities as well as enhance the positive health gains of the target population. Health Impact Assessment (HIA) provides an appropriate framework for this study, since HIA assesses the beneficial and adverse effects of a programme or policy on public health and on health inequalities through the distribution of those effects. However, HIA usually tries to influence the decision-making process using its scientific findings, mostly epidemiological and toxicological evidence. In reality, this evidence cannot establish causal links between policy and health impacts, since it cannot explain how an individual or a community reacts to changing circumstances. The proposed economic model addresses this health-policy linkage using a consumer choice approach that can explain changes in group and individual behaviour in a given economic setup. The model links epidemiological findings with economic analysis to estimate the health costs and benefits of public investment policies: that is, estimating dollar impacts when the health status of the exposed population group is changed by public programmes, for example transport initiatives that reduce congestion by building new roads, highways or tunnels, or by imposing congestion taxes.
For policy evaluation purposes, the model is incorporated in the HIA framework by establishing associations among the identified factors that drive changes in the behaviour of the target population group and, in turn, in health outcomes. The economic variables identified to estimate health inequality and health costs include levels of income, unemployment, education, age group, disadvantaged population groups, and mortality/morbidity. Model validation using case studies and/or available data from an Australian non-health policy arena (say, transport) is on the future research agenda but is beyond the scope of the current paper.
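The kind of estimate the model targets, translating a policy-induced change in health status into dollar terms, can be sketched as follows (all figures are assumed for illustration and are not from the paper):

```python
# Hypothetical sketch: dollar-valued health impact of a non-health policy,
# combining an epidemiological effect size with a unit cost per case.
exposed_population = 50_000          # people affected by the policy
baseline_incidence = 0.012           # annual cases per person before the policy
relative_risk = 0.90                 # policy reduces risk to 90% of baseline
cost_per_case = 8_500.0              # average dollar cost of one case

# Cases avoided = population x baseline incidence x absolute risk reduction.
cases_avoided = exposed_population * baseline_incidence * (1 - relative_risk)
annual_benefit = cases_avoided * cost_per_case
print(f"{cases_avoided:.0f} cases avoided, ${annual_benefit:,.0f} per year")
```

In the proposed model, the relative risk would come from epidemiological evidence filtered through the consumer choice behavioural layer, and the benefit would be distributed across income, education and other groups to expose inequality effects.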

Abstract:

This report presents the current state of, and approaches in, Building Information Modelling (BIM). The report is focused on providing a desktop audit of the current state and capabilities of the products and applications supporting BIM, including discussion of BIM model servers as well as discipline-specific applications, a distinction that is explained below. The report aims to give a broad overview of the tools and applications with respect to their BIM capabilities and in no way claims to be an exhaustive account of individual tools. Chapter 4 of the report sets out research and development agendas for the BIM approach based on the observations and analysis from the desktop audit.

Abstract:

Currently, well-established clinical therapeutic approaches for bone reconstruction are restricted to the transplantation of autografts and allografts, and the implantation of metal devices or ceramic-based implants to assist bone regeneration. Bone grafts possess osteoconductive and osteoinductive properties; however, they are limited in access and availability and are associated with donor site morbidity, haemorrhage, risk of infection, insufficient transplant integration, graft devitalisation, and subsequent resorption resulting in decreased mechanical stability. As a result, recent research has focused on the development of alternative therapeutic concepts, and the field of tissue engineering has emerged as an important approach to bone regeneration. However, bench-to-bedside translations are still infrequent, as the process towards approval by regulatory bodies is protracted and costly, requiring both comprehensive in vitro and in vivo studies. The resulting gap between research and clinical translation, and hence commercialization, is referred to as the ‘Valley of Death’ and describes the large number of projects and ventures that cease due to a lack of funding during the transition from product/technology development to regulatory approval and subsequently commercialization. One of the greatest difficulties in bridging the Valley of Death is to develop good manufacturing processes (GMP) and scalable designs and to apply these in pre-clinical studies. In this article, we describe part of the rationale and road map of how our multidisciplinary research team has approached the first steps of translating orthopaedic bone engineering from bench to bedside, by establishing a pre-clinical ovine critical-sized tibial segmental bone defect model, and discuss our preliminary data relating to this decisive step.

Abstract:

Queensland University of Technology (QUT) is faced with a rapidly growing research agenda built upon a strategic research capacity-building program. This presentation will outline the results of a project that has recently investigated QUT’s research support requirements and developed a model for the support of eResearch across the university. QUT’s research building strategy has produced growth at the faculty level and within its research institutes. This increased research activity is pushing the need for university-wide eResearch platforms capable of providing infrastructure and support in areas such as collaboration, data, networking, authentication and authorisation, workflows and the grid. One of the driving forces behind the investigation is the data-centric nature of modern research. It is now critical that researchers have access to supported infrastructure that allows the collection, analysis, aggregation and sharing of large data volumes for exploration and mining, in order to gain new insights and to generate new knowledge. However, recent surveys of current research data management practices by the Australian Partnership for Sustainable Repositories (APSR), and by QUT itself, have revealed serious shortcomings in areas such as research data management, especially the long-term maintenance of data for reuse and as authoritative evidence of research findings. While these internal university pressures are building, external pressures are magnifying them. For example, recent compliance guidelines from bodies such as the ARC, NHMRC and Universities Australia indicate that institutions need to provide facilities for the safe and secure storage of research data, along with a surrounding set of policies on its retention, ownership and accessibility.
The newly formed Australian National Data Service (ANDS) is developing strategies and guidelines for research data management, and research institutions are a central focus, responsible for managing and storing institutional data on platforms that can be federated nationally and internationally for wider use. For some time QUT has recognised the importance of eResearch and has been active in a number of related areas: ePrints to digitally publish research papers, grid computing portals and workflows, institution-wide provisioning and authentication systems, and legal protocols for copyright management. QUT also has two widely recognised centres focused on fundamental research into eResearch itself: the OAK LAW project (Open Access to Knowledge), which focuses on legal issues relating to eResearch, and the Microsoft QUT eResearch Centre, whose goal is to accelerate scientific research discovery through new smart software. In order to better harness all of these resources and improve research outcomes, the university recently established a project to investigate how it might better organise the support of eResearch. This presentation will outline the project outcomes, which include a flexible and sustainable eResearch support service model addressing short and longer term research needs, identification of the resources required to establish and sustain the service, and the development of research data management policies and implementation plans.

Abstract:

The increasing prevalence of International New Ventures (INVs) during the past twenty years has been highlighted by numerous studies (Knight and Cavusgil, 1996; Moen, 2002). International New Ventures are firms, typically small to medium enterprises, that internationalise within six years of inception (Oviatt and McDougall, 1997). To date there has been no general consensus within the literature on a theoretical framework of internationalisation to explain the internationalisation process of INVs (Madsen and Servais, 1997). However, some researchers have suggested that the innovation diffusion model may provide a suitable theoretical framework (Chetty and Hamilton, 1996; Fan and Phan, 2007). The proposed model was based on existing and well-established innovation diffusion theories drawn from the consumer behaviour and internationalisation literature (Lim, Sharkey and Kim, 1991; Reid, 1981; Robertson, 1971; Rogers, 1962; Wickramasekera and Oczkowski, 2006). The results of the analysis indicated that the synthesised model of export adoption was effective in explaining the internationalisation process of INVs within the Queensland Food and Beverage Industry. Significantly, the results also confirmed features of the original I-models developed in the consumer behaviour literature that had received limited examination in the internationalisation literature, including the ability of firms, or specifically decision-makers, to skip stages based on previous experience.

Abstract:

The proliferation of innovative schemes to address climate change at international, national and local levels signals a fundamental shift in the priority and role of the natural environment to society, organizations and individuals. This shift in shared priorities invites academics and practitioners to consider the role of institutions in shaping and constraining responses to climate change at multiple levels of organizations and society. Institutional theory provides an approach to conceptualising and addressing climate change challenges by focusing on the central logics that guide society, organizations and individuals and their material and symbolic relationship to the environment. For example, framing a response to climate change in the form of an emissions trading scheme evidences a practice informed by a capitalist market logic (Friedland and Alford 1991). However, not all responses need necessarily align with a market logic. Indeed, Thornton (2004) identifies six broad societal sectors, each with its own logic (markets, corporations, professions, states, families, religions). Hence, understanding the logics that underpin successful (and unsuccessful) climate change initiatives helps reveal how institutions shape and constrain practices, and provides valuable insights for policy-makers and organizations. This paper develops models and propositions to consider the construction of, and challenges to, climate change initiatives based on institutional logics (Thornton and Ocasio 2008). We propose that the challenge of understanding and explaining how climate change initiatives are successfully adopted be examined in terms of their institutional logics, and how these logics evolve over time. To achieve this, a multi-level framework of analysis encompassing society, organizations and individuals is necessary (Friedland and Alford 1991).
However, to date most extant studies of institutional logics have tended to emphasize one level over the others (Thornton and Ocasio 2008: 104). In addition, existing studies related to climate change initiatives have largely been descriptive (e.g. Braun 2008) or prescriptive (e.g. Boiral 2006) in terms of the suitability of particular practices. This paper contributes to the literature on logics in two ways. First, the proliferation of the climate change agenda provides a site in which to study how institutional logics are played out across multiple, yet embedded, levels within society through the institutional forums in which change takes place. Second, the paper specifically examines how institutional logics provide society with organising principles (material practices and symbolic constructions) which enable and constrain actions and help define motives and identity. Based on this model, we develop a series of propositions on the conditions required for the successful introduction of climate change initiatives. The paper proceeds as follows. We present a review of the literature related to institutional logics and develop a generic model of the operation of institutional logics. We then consider how this applies to key initiatives related to climate change. Finally, we develop a series of propositions to guide insights into the successful implementation of climate change practices.

Abstract:

Purpose – The paper aims to describe a workforce-planning model developed in-house in an Australian university library, based on rigorous environmental scanning of the institution, the profession and the sector.
Design/methodology/approach – The paper uses a case study that describes the stages of the planning process undertaken to develop the Library’s Workforce Plan and the documentation produced.
Findings – While the process has had successful and productive outcomes, workforce planning is an ongoing process. To remain effective, the workforce plan needs to be reviewed annually in the context of the library’s overall planning program. This is imperative if the plan is to remain current and to be regarded as a living document that will continue to guide library practice.
Research limitations/implications – Although a single case study, the work has been contextualized within the wider research into workforce planning.
Practical implications – The paper provides a model that can easily be deployed within a library without external or specialist consultant skills and, due to its scalability, can be applied at department or wider level.
Originality/value – The paper identifies the trends impacting on, and the emerging opportunities for, university libraries and provides a model for workforce planning that recognizes the context and culture of the organization as key drivers in determining workforce planning.
Keywords – Australia, University libraries, Academic libraries, Change management, Manpower planning
Paper type – Case study

Abstract:

This paper reports preliminary results from a study modeling the interplay between multitasking, cognitive coordination, and cognitive shifts during Web search. Study participants conducted three Web searches on personal information problems. Data collection techniques included pre- and post-search questionnaires, think-aloud protocols, Web search logs, observation, and post-search interviews. Key findings include: (1) users’ Web searches included multitasking, cognitive shifting and cognitive coordination processes; (2) cognitive coordination is the hinge linking multitasking and cognitive shifting that enables Web search construction; (3) cognitive shift levels determine the process of cognitive coordination; and (4) cognitive coordination is an interplay of the task, mechanism and strategy levels that underpin multitasking and task switching. An initial model depicts the interplay between multitasking, cognitive coordination, and cognitive shifts during Web search. Implications of the findings and further research are also discussed.

Abstract:

In service interaction modelling, it is customary to distinguish between two types of models: choreographies and orchestrations. A choreography describes interactions within a collection of services from a global perspective, where no service plays a privileged role; instead, services interact in a peer-to-peer manner. In contrast, an orchestration describes the interactions between one particular service, the orchestrator, and a number of partner services. The main proposition of this work is an approach to bridging these two modelling viewpoints by synthesising orchestrators from choreographies. To start with, choreographies are defined using a simple behaviour description language based on communicating finite state machines. From such a model, orchestrators are initially synthesised in the form of state machines. It turns out that state machines are not suitable for orchestration modelling, because orchestrators generally need to engage in concurrent interactions. To address this issue, a technique is proposed to transform state machines into process models in the Business Process Modelling Notation (BPMN). Orchestrations represented in BPMN can then be augmented with additional business logic to achieve value-adding mediation. In addition, techniques exist for refining BPMN models into executable process definitions. The transformation from state machines to BPMN relies on Petri nets as an intermediary representation and leverages techniques from the theory of regions to identify concurrency in the initial Petri net. Once concurrency has been identified, the resulting Petri net is transformed into a BPMN model. The original contributions of this work are an algorithm to synthesise orchestrators from choreographies and a rule-based transformation from Petri nets into BPMN.
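The synthesis step can be illustrated, for the much-simplified case of a purely sequential choreography, by projecting a global interaction sequence onto the orchestrator role. The approach described in the paper additionally handles branching and concurrency via Petri nets and the theory of regions; this sketch, with hypothetical roles and messages, does not:

```python
def project(choreography, role):
    """Project a global interaction sequence onto one role, keeping only
    the send/receive events that role participates in. This yields the
    role's local (linear) state machine as an event list."""
    local = []
    for sender, receiver, message in choreography:
        if sender == role:
            local.append(("send", message, receiver))
        elif receiver == role:
            local.append(("receive", message, sender))
    return local

# Hypothetical choreography between Customer, Orchestrator and Supplier:
# each triple is (sender, receiver, message).
choreography = [
    ("Customer", "Orchestrator", "order"),
    ("Orchestrator", "Supplier", "purchase"),
    ("Supplier", "Orchestrator", "confirm"),
    ("Orchestrator", "Customer", "invoice"),
]
local_behaviour = project(choreography, "Orchestrator")
print(local_behaviour)
```

Each event in the projected list becomes a state transition of the synthesised orchestrator; the harder cases the paper addresses arise when such lists interleave, which is where Petri nets and region theory are needed to recover concurrency for the BPMN model.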