521 results for Conceptual site models


Relevance:

20.00%

Publisher:

Abstract:

Although Design Led Innovation activities aim to raise the value of design within the business, knowledge is needed about which tools are available to support companies, and about how to apply them to connect design for new product development with design as a strategic driver of growth. This paper presents a conceptual method that supplements existing processes and tools to assist companies to grow through design. The model extends the authors’ previous work to explore how, through storytelling, customer observations can be captured and translated into new meaning, creating new design propositions that are shaped into product needs, which in turn drive internal business activities, the brand and the strategic vision. The paper addresses a gap in the theoretical frameworks and literature by highlighting the need to align and scale design processes to match the needs of SMEs as they transition along a trajectory to become design-led businesses.

Relevance:

20.00%

Publisher:

Abstract:

Increasingly, studies are reported that examine how conceptual modeling is conducted in practice. Yet, the studies to date have typically examined in isolation how modeling grammars can be, or are, used to develop models of information systems or organizational processes, without considering that such modeling is typically done by means of a modeling tool that extends the modeling functionality offered by a grammar through complementary features. This paper extends the literature by examining how the use of seven different features of modeling tools affects the usage beliefs users develop when using modeling grammars for process modeling. We show that five distinct tool features positively affect the usefulness, ease-of-use and satisfaction beliefs of users. We offer a number of interpretations of the findings, and describe how the results inform decisions of relevance to developers of modeling tools as well as to managers in charge of modeling-related investment decisions.

Relevance:

20.00%

Publisher:

Abstract:

Unstructured text data, such as emails, blogs, contracts, academic publications, organizational documents, transcribed interviews, and even tweets, are important sources of data in Information Systems research. Various forms of qualitative analysis of the content of these data exist and have revealed important insights. Yet, to date, these analyses have been hampered by limitations of human coding of large data sets, and by bias due to human interpretation. In this paper, we compare and combine two quantitative analysis techniques to demonstrate the capabilities of computational analysis for content analysis of unstructured text. Specifically, we seek to demonstrate how two quantitative analytic methods, viz., Latent Semantic Analysis and data mining, can aid researchers in revealing core content topic areas in large (or small) data sets, and in visualizing how these concepts evolve, migrate, converge or diverge over time. We exemplify the complementary application of these techniques through an examination of a 25-year sample of abstracts from selected journals in Information Systems, Management, and Accounting disciplines. Through this work, we explore the capabilities of two computational techniques, and show how these techniques can be used to gather insights from a large corpus of unstructured text.
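As a toy illustration of the first of these techniques, the sketch below applies Latent Semantic Analysis via a truncated SVD of a small term-document matrix. The corpus, counts and topic dimension are invented for illustration and use plain numpy, not the study's data or pipeline.

```python
# A minimal numpy-only sketch of Latent Semantic Analysis: a tiny
# term-document count matrix (illustrative, not study data) is
# factorised with SVD and truncated to a 2-dimensional topic space.
import numpy as np

terms = ["process", "model", "text", "mining", "journal", "abstract"]
# Columns are four hypothetical documents; entries are term counts.
X = np.array([
    [3, 0, 0, 1],   # process
    [2, 1, 0, 0],   # model
    [0, 3, 2, 0],   # text
    [0, 2, 1, 0],   # mining
    [0, 0, 2, 3],   # journal
    [0, 0, 1, 2],   # abstract
], dtype=float)

# Truncated SVD keeps the k strongest latent "topics".
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
doc_topics = (np.diag(s[:k]) @ Vt[:k, :]).T  # one k-vector per document

print(doc_topics.shape)  # (4, 2)
```

Tracking how the document vectors move in this low-rank space over successive time slices is one simple way to visualise topics converging or diverging, in the spirit of the analysis described above.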

Relevance:

20.00%

Publisher:

Abstract:

Technologies and languages for integrated processes are a relatively recent innovation, yet over this short period many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to the underlying middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-standing and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally force modellers to choose a particular underlying middleware and to stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model, which can lead to an inability to reason about long-standing conversations at the process layer. Technologies and languages for process integration generally lack formality, which has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and these can lead to customer dissatisfaction and to integration projects failing to reach their potential. Standards for process integration share fundamental flaws with these languages and technologies, and compete directly with one another, causing a lack of clarity. Thus the area of greatest risk in a BPM project remains process integration, despite major advancements in the technology base. This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner.
In this way, process modelling can be conceptually complete without becoming locked into a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware; they provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling, a formal modelling technique, was used to define the syntax of a proposed process integration language. This thesis makes several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware; this framework provides a conceptual foundation upon which a process integration language can be built. The thesis defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. It also provides a comprehensive set of process integration patterns, which constitute a benchmark for the kinds of problems a process integration language must support. Finally, the thesis proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital, based on the ideas in this thesis, is briefly described at the end of the thesis.
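To give a flavour of how a Petri-net-style semantics captures middleware dynamics, the following toy place/transition sketch simulates a one-way messaging channel. It is a plain (uncoloured) net in ordinary Python, invented for illustration, not the author's Coloured Petri net models.

```python
# A toy Petri-net-style model of a one-way messaging channel: places
# hold tokens (here, queued messages) and transitions fire only when
# their input place is marked.
places = {"sender_ready": ["m1", "m2"], "channel": [], "received": []}

def fire_send():
    """Transition: move a message from the sender into the channel."""
    if places["sender_ready"]:               # enabled iff a token exists
        places["channel"].append(places["sender_ready"].pop(0))
        return True
    return False

def fire_receive():
    """Transition: consume a message from the channel."""
    if places["channel"]:
        places["received"].append(places["channel"].pop(0))
        return True
    return False

# Interleave firings until no transition is enabled.
while fire_send() or fire_receive():
    pass

print(places["received"])  # ['m1', 'm2']
```

A formal CPN treatment would additionally type the tokens and attach guard expressions to transitions, which is what allows properties of long-standing conversations to be reasoned about rigorously.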

Relevance:

20.00%

Publisher:

Abstract:

Currently, well-established clinical therapeutic approaches for bone reconstruction are restricted to the transplantation of autografts and allografts, and to the implantation of metal devices or ceramic-based implants to assist bone regeneration. Bone grafts possess osteoconductive and osteoinductive properties; their application, however, is associated with disadvantages, including limited access and availability, donor-site morbidity and haemorrhage, an increased risk of infection, and insufficient transplant integration. As a result, recent research has focused on the development of complementary therapeutic concepts, and the field of tissue engineering has emerged as an important alternative approach to bone regeneration. Tissue engineering unites aspects of cellular biology, biomechanical engineering, biomaterial sciences, and trauma and orthopaedic surgery. To obtain approval from regulatory bodies for these novel therapeutic concepts, the level of therapeutic benefit must be demonstrated rigorously in well-characterized, clinically relevant animal models. Therefore, in this PhD project, a reproducible and clinically relevant ovine, critically sized, high-load-bearing tibial defect model was established and characterized as a prerequisite to assessing, in vivo, the regenerative potential of a novel treatment concept involving a medical-grade polycaprolactone and tricalcium phosphate composite scaffold and recombinant human bone morphogenetic proteins.

Relevance:

20.00%

Publisher:

Abstract:

Overcoming many of the constraints to early-stage investment in biofuels production from sugarcane bagasse in Australia requires an understanding of the complex technical, economic and systemic challenges associated with the transition of established sugar industry structures from single-product agri-businesses to new, diversified multi-product biorefineries. While positive investment decisions in new infrastructure require technically feasible solutions and the attainment of project economic investment thresholds, many other systemic factors influence the investment decision, including the interrelationships between feedstock availability and energy use, competing product alternatives, technology acceptance, and perceptions of project uncertainty and risk. This thesis explores the feasibility of a new cellulosic ethanol industry in Australia based on the large sugarcane fibre (bagasse) resource available. The research explores industry feasibility from multiple angles, including the challenges of integrating ethanol production into an established sugarcane processing system, scoping the economic drivers and key variables relating to bioethanol projects, and considering the impact of emerging technologies in improving industry feasibility. The opportunities available from pilot-scale technology demonstration are also addressed. Systems analysis techniques are used to explore the interrelationships between the existing sugarcane industry and the developing cellulosic biofuels industry. This analysis resulted in the development of a conceptual framework for a bagasse-based cellulosic ethanol industry in Australia, which is then used to assess the uncertainty in key project factors and investment risk.
The analysis showed that the fundamental issue affecting investment in a cellulosic ethanol industry from sugarcane in Australia is the uncertainty in the future price of ethanol, and that government support reducing the risks associated with early-stage investment is likely to be necessary to promote commercialisation of this novel technology. Comprehensive techno-economic models were developed and used to assess the potential quantum of ethanol production from sugarcane in Australia, to assess the feasibility of a soda-based biorefinery at the Racecourse Sugar Mill in Mackay, Queensland, and to assess the feasibility of reducing the cost of production of fermentable sugars through the in-planta expression of cellulases in sugarcane. These assessments show that ethanol from sugarcane in Australia has the potential to make a significant contribution to reducing Australia’s reliance on fossil-derived transportation fuels, and that economically viable projects exist depending upon assumptions relating to product price, ethanol taxation arrangements and greenhouse gas emission reduction incentives. The conceptual design and development of a novel pilot-scale cellulosic ethanol research and development facility is also reported in this thesis. The establishment of this facility enables the technical and economic feasibility of new technologies to be assessed in a multi-partner, collaborative environment. As a key outcome of this work, the study has delivered a facility that will enable novel cellulosic ethanol technologies to be assessed in a low-investment-risk environment, reducing the potential risks associated with early-stage investment in commercial projects and hence promoting more rapid technology uptake.
While the study has focussed on an exploration of the feasibility of a commercial cellulosic ethanol industry from sugarcane in Australia, many of the same key issues will be of relevance to other sugarcane industries throughout the world seeking diversification of revenue through the implementation of novel cellulosic ethanol technologies.

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, business process management is an important approach for managing organizations from an operational perspective. As a consequence, it is common to see organizations develop collections of hundreds or even thousands of business process models. Such large collections of process models bring new challenges and provide new opportunities, as the knowledge that they encapsulate needs to be properly managed. A variety of techniques for managing large collections of business process models is therefore being developed. The goal of this paper is to provide an overview of the management techniques that currently exist, as well as the open research challenges that they pose.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Emergency departments (EDs) are critical to the management of acute illness and injury, and to the provision of health system access. However, EDs have become increasingly congested due to increased demand, increased complexity of care and blocked access to ongoing care (access block). Congestion has both clinical and organisational implications. This paper describes the factors that appear to influence demand for ED services, and their interrelationships, as the basis for further research into the role of private hospital EDs. DATA SOURCES: Multiple databases (PubMed, ProQuest, Academic Search Elite and Science Direct) and relevant journals were searched using terms related to EDs and emergency health needs. Literature pertaining to emergency department utilisation worldwide was identified, and articles were selected for further examination on the basis of their relevance and significance to ED demand. RESULTS: Factors influencing ED demand can be categorised into those describing the health needs of patients, those predisposing a patient to seek help, and those relating to policy factors such as the provision of services and insurance status. This paper describes the factors influencing ED presentations and proposes a novel conceptual map of their interrelationships. CONCLUSION: This review has explored the factors contributing to the growing demand for ED care, the influence these factors have on ED demand, and their interrelationships as depicted in the conceptual model.

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the existing and potential scope of Dublin Core metadata in Knowledge Management contexts. Modelling knowledge is identified as a conceptual prerequisite in this investigation, principally for the purpose of clarifying scope prior to identifying the range of tasks associated with organising knowledge. A variety of models is presented, and the relationships between data, information and knowledge are discussed. It is argued that the two most common modes of organisation, hierarchies and networks, influence the effectiveness and flow of knowledge. Practical perspective is provided by reference to implementations and projects that give evidence of how DC metadata is applied in such contexts. A sense-making model is introduced that can be used as a shorthand reference for identifying useful facets of knowledge that might be described using metadata. The discussion presents this model in a way that both validates current applications and points to potential novel applications.

Relevance:

20.00%

Publisher:

Abstract:

There is an intimate interconnectivity between the policy guidelines defining reform and the delineation of the research methods subsequently applied to determine reform success. Research is guided as much by the metaphors describing it as by the ensuing empirical definition of the actions and results obtained from it. In a call for different reform policy metaphors, Lumby and English (2010) note, “The primary responsibility for the parlous state of education... lies with the policy makers that have racked our schools with reductive and dehumanizing processes, following the metaphors of market efficiency, and leadership models based on accounting and the characteristics of machine bureaucracy” (p. 127).

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an approach to building an observation likelihood function from a set of sparse, noisy training observations taken from known locations by a sensor with no obvious geometric model. The basic approach is to fit an interpolant to the training data, representing the expected observation, and to assume additive sensor noise. The paper takes a Bayesian view of the problem, maintaining a posterior over interpolants rather than simply the maximum-likelihood interpolant, giving a measure of uncertainty in the map at any point. This is done using a Gaussian process framework. To validate the approach experimentally, a model of an environment is built using observations from an omni-directional camera. After the model has been built from the training data, a particle filter is used to localise while traversing this environment.
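A minimal numpy sketch of this idea computes the Gaussian process posterior mean and variance of the expected observation at a query location. The 1-D training locations, observations and kernel hyperparameters below are invented for illustration; the paper's setting (omni-directional imagery over 2-D poses) is richer.

```python
# Gaussian process regression over sparse, noisy observations taken
# at known locations, with an assumed additive-noise sensor model.
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential kernel between two vectors of locations."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

X = np.array([0.0, 1.0, 2.5, 4.0])      # known training locations
y = np.array([0.1, 0.9, 0.4, -0.6])     # noisy sensor observations
noise = 0.05                            # assumed sensor noise variance

K = rbf(X, X) + noise * np.eye(len(X))  # noisy training covariance
Kinv = np.linalg.inv(K)

def predict(xq):
    """Posterior mean and variance of the expected observation at xq."""
    k = rbf(np.atleast_1d(xq), X)
    mean = k @ Kinv @ y
    var = rbf(np.atleast_1d(xq), np.atleast_1d(xq)) - k @ Kinv @ k.T
    return mean[0], var[0, 0]

# Near training data the variance is small; far away it reverts to
# the prior, which is exactly the map-uncertainty measure described.
m, v = predict(1.0)
```

The per-point variance returned here is what allows a particle filter to weight observation likelihoods more cautiously in poorly mapped regions.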

Relevance:

20.00%

Publisher:

Abstract:

Modern technology now has the ability to generate large datasets over space and time. Such data typically exhibit high autocorrelations over all dimensions. The field trial data motivating the methods of this paper were collected to examine the behaviour of traditional cropping and to determine a cropping system that could maximise water use for grain production while minimising leakage below the crop root zone. They consist of moisture measurements made at 15 depths across 3 rows and 18 columns, in the lattice framework of an agricultural field. Bayesian conditional autoregressive (CAR) models are used to account for local site correlations. Conditional autoregressive models have not been widely used in analyses of agricultural data, and this paper serves to illustrate their usefulness in this field, along with the ease of implementation in WinBUGS, a freely available software package. The innovation is the fitting of separate conditional autoregressive models for each depth layer, the ‘layered CAR model’, while simultaneously estimating depth profile functions for each site treatment. Modelling interest also lay in how best to model the treatment effect depth profiles, and in the choice of neighbourhood structure for the spatial autocorrelation model. The favoured model fitted the treatment effects as splines over depth, treated depth, the basis for the regression model, as measured with error, and fitted CAR neighbourhood models by depth layer. It is hierarchical, with separate conditional autoregressive spatial variance components at each depth, and the fixed terms, which involve an errors-in-measurement model, treat depth errors as interval-censored measurement error. The Bayesian framework permits transparent specification and easy comparison of the various complex models considered.
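To make the neighbourhood idea concrete, the following numpy sketch builds the precision matrix of an intrinsic CAR prior for one depth layer under a rook (edge-sharing) neighbourhood. The 3 x 3 lattice is a toy size, not the paper's 3-row by 18-column field, and this is an illustration of the CAR structure rather than the WinBUGS model itself.

```python
# Precision matrix Q = D - W of an intrinsic CAR prior on one layer
# of a lattice, with rook neighbours (sites sharing an edge).
import numpy as np

rows, cols = 3, 3
n = rows * cols
W = np.zeros((n, n))           # adjacency: 1 if sites are neighbours

def idx(r, c):
    return r * cols + c

for r in range(rows):
    for c in range(cols):
        for dr, dc in ((1, 0), (0, 1)):      # right and down edges
            rr, cc = r + dr, c + dc
            if rr < rows and cc < cols:
                W[idx(r, c), idx(rr, cc)] = 1
                W[idx(rr, cc), idx(r, c)] = 1

# D holds each site's neighbour count on its diagonal.
D = np.diag(W.sum(axis=1))
Q = D - W

# Each row of Q sums to zero, so the prior is improper (intrinsic CAR):
print(np.allclose(Q.sum(axis=1), 0))  # True
```

In the layered model described above, a separate precision matrix of this form (with its own spatial variance component) would apply at each of the depth layers.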

Relevance:

20.00%

Publisher:

Abstract:

The lack of a satisfactory consensus for characterizing system intelligence, and of structured analytical decision models, has inhibited developers and practitioners from understanding and configuring optimum intelligent building systems in a fully informed manner. So far, little research has been conducted in this area. This research is designed to identify the key intelligence indicators and to develop analytical models for computing the system intelligence score of a smart building system in the intelligent building. The integrated building management system (IBMS) was used as an illustrative example to present the framework. The models presented in this study apply system intelligence theory within a conceptual analytical framework. A total of 16 key intelligence indicators were first identified from a general survey. Two multi-criteria decision making (MCDM) approaches, the analytic hierarchy process (AHP) and the analytic network process (ANP), were then employed to develop the system intelligence analytical models. The top intelligence indicators of the IBMS include self-diagnosis of operation deviations, an adaptive limiting control algorithm, and year-round time schedule performance. The developed conceptual framework was then transformed into a practical model, whose effectiveness was evaluated by means of expert validation. The main contribution of this research is to promote understanding of the intelligence indicators and to lay the foundation for a systematic framework that provides developers and building stakeholders with a consolidated, inclusive tool for the system intelligence evaluation of proposed component design configurations.
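As an illustration of the AHP step, the sketch below derives priority weights for three hypothetical intelligence indicators from an invented Saaty-scale pairwise comparison matrix and checks its consistency ratio. The judgements shown are fabricated for illustration and are not taken from the study's survey.

```python
# AHP priority derivation via the principal eigenvector of a pairwise
# comparison matrix, with Saaty's consistency-ratio check.
import numpy as np

# A[i, j] = how much more important indicator i is than indicator j.
A = np.array([
    [1.0, 3.0, 5.0],    # e.g. self-diagnosis vs the other two
    [1/3, 1.0, 2.0],    # e.g. adaptive limiting control
    [1/5, 1/2, 1.0],    # e.g. time schedule performance
])

# Priorities are the normalised dominant eigenvector of A.
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Consistency: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
# where RI = 0.58 is Saaty's random index for a 3 x 3 matrix.
lam = np.max(np.real(vals))
ci = (lam - len(A)) / (len(A) - 1)
cr = ci / 0.58

print(w.round(3), round(cr, 3))   # judgements accepted when CR < 0.1
```

ANP generalises this by allowing dependence and feedback between indicators, replacing the single hierarchy with a weighted supermatrix.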

Relevance:

20.00%

Publisher:

Abstract:

The Texas Department of Transportation (TxDOT) is concerned about the widening gap between preservation needs and available funding. Because funding levels are not adequate to meet the preservation needs of the roadway network, projects listed in the 4-Year Pavement Management Plan must be ranked to determine which should be funded now and which can be postponed until a later year. Currently, each district uses locally developed methods to prioritize projects; these ranking methods have relied on less formal, qualitative assessments based on engineers’ subjective judgment. It is important for TxDOT to have a 4-Year Pavement Management Plan that uses a transparent, rational project ranking process. The objective of this study is to develop a conceptual framework that describes the development of the 4-Year Pavement Management Plan. The framework can be broadly divided into three steps: 1) a network-level project screening process, 2) a project-level project ranking process, and 3) an economic analysis. A rational pavement management procedure and a project ranking method accepted by the districts and the TxDOT administration will maximize efficiency in budget allocations and will potentially help improve pavement condition. As part of the implementation of the 4-Year Pavement Management Plan, a Network-Level Project Screening (NLPS) tool, including a candidate project identification algorithm and a preliminary project ranking matrix, was developed. The NLPS has been used by the Austin District Pavement Engineer (DPE) to evaluate PMIS (Pavement Management Information System) data and to prepare a preliminary list of candidate projects for further evaluation.

Relevance:

20.00%

Publisher:

Abstract:

Sourcing appropriate funding for the provision of new urban infrastructure has been a policy dilemma for governments around the world for decades. This is particularly relevant in high-growth areas where new services are required to support swelling populations. The Australian infrastructure funding policy dilemmas reflect similar matters in many countries, particularly the United States of America, where infrastructure cost recovery policies have been in place since the 1970s. There is an extensive body of both theoretical and empirical literature from these countries that discusses the passing on (to home buyers) of these infrastructure charges, and the corresponding impact on housing prices. The theoretical evidence is consistent in its finding that infrastructure charges are passed on to home buyers by way of higher house prices. The empirical evidence is also consistent, with “overshifting” of these charges evident in all models since the 1980s; that is, a $1 infrastructure charge results in a greater than $1 increase in house prices. However, despite over a dozen separate US studies on this topic over two decades, no empirical work has been carried out in Australia to test whether similar shifting or overshifting occurs here. The purpose of this research is to conduct a preliminary analysis of the more recent models used in these US empirical studies in order to identify the key study area selection criteria and success factors. The paper concludes that many of the study area selection criteria are implicit rather than explicit. Collecting data across the models makes some implicit criteria apparent, whilst others remain elusive. These data will inform future research on whether an existing model can be adopted, or adapted, for use in Australia.