864 results for Process model alignment


Relevance: 30.00%

Abstract:

Business process management systems (BPMS) belong to a class of enterprise information systems characterized by their dependence on explicitly modeled process logic. Through the process logic, it is relatively easy to manage explicitly the routing and allocation of work items along a business process through the system. Inspired by the DeLone and McLean framework, we theorize that these process-aware system features are important attributes of system quality, which in turn elevate key user evaluations such as perceived usefulness and usage satisfaction. We examine this theoretical model using data collected from four different, mostly mature BPM system projects. Our findings validate the importance of input quality as well as allocation and routing attributes as antecedents of system quality, which, in turn, determines both usefulness of and satisfaction with the system. We further demonstrate how service quality and workflow dependency are significant precursors to perceived usefulness. Our results suggest the appropriateness of a multi-dimensional conception of system quality for future research, and provide important design-oriented advice for the configuration of BPMSs.

Relevance: 30.00%

Abstract:

Tilting-pad hydrodynamic thrust bearings are used in hydroelectric power stations around the world, reliably supporting turbines weighing hundreds of tonnes over decades of service. Newer designs incorporate hydrostatic recesses machined into the sector-shaped pads to enhance oil film thickness at low rotational speeds. External pressurisation practically eliminates wear and enhances service life and reliability. It follows that older generating plants, lacking such assistance, stand to benefit from being retrofitted with hydrostatic lubrication systems. The design process is not trivial, however. The need to increase the groove size to permit spontaneous lifting of the turbine under hydrostatic pressure conflicts with the need to preserve the performance of the original plane-pad design. A haphazardly designed recess can induce a significant rise in bearing temperature, concomitant with reduced mechanical efficiency and a risk of thermal damage. In this work, a numerical study of a sector-shaped pad is undertaken to demonstrate how recess size and shape can affect the performance of a typical bearing.
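The retrofit sizing problem above starts from a simple static force balance: at lift-off, the hydrostatic pressure supplied to the recess must at least support that pad's share of the turbine weight. A minimal sketch, with entirely hypothetical load and recess figures:

```python
def liftoff_pressure(load_n, recess_area_m2):
    """First-order estimate of the recess pressure (Pa) needed to lift the pad.

    At lift-off, the supply pressure acting over the recess area must balance
    the share of turbine weight carried by the pad. A real design must also
    account for the pressure falling off across the land surrounding the recess.
    """
    return load_n / recess_area_m2

# Hypothetical figures: a 300-tonne turbine shared equally by 8 pads,
# each pad lifting its share on a 0.02 m^2 recess.
load_per_pad = 300e3 * 9.81 / 8          # N per pad
p = liftoff_pressure(load_per_pad, 0.02)
print(f"required recess pressure ~ {p / 1e6:.1f} MPa")
```

Enlarging the recess lowers the required supply pressure, which is exactly the tension the abstract describes: a bigger recess eases lift-off but departs further from the plane-pad geometry that governs hydrodynamic performance at speed.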

Relevance: 30.00%

Abstract:

During the last several decades, the quality of natural resources and their services has been exposed to significant degradation from increased urban populations combined with the sprawl of settlements, the development of transportation networks and industrial activities (Dorsey, 2003; Pauleit et al., 2005). As a result of this environmental degradation, a sustainable framework for urban development is required to ensure the resilience of natural resources and ecosystems. Sustainable urban development refers to the management of cities with adequate infrastructure to support the needs of their populations for present and future generations as well as to maintain the sustainability of their ecosystems (UNEP/IETC, 2002; Yigitcanlar, 2010). One of the important strategic approaches for planning sustainable cities is 'ecological planning'. Ecological planning is a multi-dimensional concept that aims to preserve biodiversity richness and ecosystem productivity through the sustainable management of natural resources (Barnes et al., 2005). As stated by Baldwin (1985, p. 4), ecological planning is the initiation and operation of activities to direct and control the acquisition, transformation, disruption and disposal of resources in a manner capable of sustaining human activities with a minimum disruption of ecosystem processes. Therefore, ecological planning is a powerful method for creating sustainable urban ecosystems. In order to explore the city as an ecosystem and investigate the interaction between the urban ecosystem and human activities, a holistic urban ecosystem sustainability assessment approach is required. Urban ecosystem sustainability assessment serves as a tool that helps policy- and decision-makers improve their actions towards sustainable urban development.
There are several methods used in urban ecosystem sustainability assessment, among which sustainability indicators and composite indices are the most commonly used tools for assessing progress towards sustainable land use and urban management. Currently, a variety of composite indices are available to measure sustainability at the local, national and international levels. However, the main conclusion drawn from the literature review is that they are too broad to be applied to assess local and micro-level sustainability, and that no benchmark value exists for most of the indicators due to limited data availability and non-comparable data across countries. Mayer (2008, p. 280) supports this point by stating that "as different as the indices may seem, many of them incorporate the same underlying data because of the small number of available sustainability datasets". Mori and Christodoulou (2011) also argue that this relative evaluation and comparison brings along biased assessments, as data only exist for some entities, which also means excluding many nations from evaluation and comparison. Thus, there is a need to develop an accurate and comprehensive micro-level urban ecosystem sustainability assessment method. In order to develop such a model, it is practical to adopt an approach that utilises indicators for collecting data, designates certain threshold values or ranges, performs a comparative sustainability assessment via indices at the micro-level, and aggregates these assessment findings to the local level. Through this approach and model, it is possible to produce sufficient and reliable data to enable comparison at the local level, and to provide useful results to inform local planning, conservation and development decision-making processes so as to secure sustainable ecosystems and urban futures.
To advance research in this area, this study investigated the environmental impacts of an existing urban context by using a composite index, with the aim of identifying the interaction between urban ecosystems and human activities in the context of environmental sustainability. In this respect, the study developed a new comprehensive urban ecosystem sustainability assessment tool entitled the 'Micro-level Urban-ecosystem Sustainability IndeX' (MUSIX). The MUSIX model is an indicator-based indexing model that investigates the factors affecting urban sustainability in a local context. The model outputs provide local and micro-level sustainability reporting guidance to help policy-making concerning environmental issues. A multi-method research approach, based on both quantitative and qualitative analysis, was employed in the construction of the MUSIX model. First, qualitative research was conducted through an interpretive and critical literature review to develop the theoretical framework and select indicators. Afterwards, quantitative research was conducted through statistical and spatial analyses for data collection, processing and model application. The MUSIX model was tested in four pilot study sites selected from the Gold Coast City, Queensland, Australia. The model results assessed the sustainability performance of current urban settings with reference to six main issues of urban development: (1) hydrology; (2) ecology; (3) pollution; (4) location; (5) design; and (6) efficiency. For each category, a set of core indicators was assigned which are intended to: (1) benchmark the current situation, strengths and weaknesses; (2) evaluate the efficiency of implemented plans; and (3) measure progress towards sustainable development.
While the indicator set of the model provided specific information about the environmental impacts in the area at the parcel scale, the composite index score provided general information about the sustainability of the area at the neighbourhood scale. Finally, in light of the model findings, integrated ecological planning strategies were developed to guide the preparation and assessment of development and local area plans in conjunction with the Gold Coast Planning Scheme, which establishes regulatory provisions to achieve ecological sustainability through the formulation of place codes, development codes, constraint codes and other assessment criteria that provide guidance for best practice development solutions. These strategies can be summarised as follows:
• Establishing hydrological conservation through sustainable stormwater management in order to preserve the Earth's water cycle and aquatic ecosystems;
• Providing ecological conservation through sustainable ecosystem management in order to protect biological diversity and maintain the integrity of natural ecosystems;
• Improving environmental quality through developing pollution prevention regulations and policies in order to promote high quality water resources, clean air and enhanced ecosystem health;
• Creating sustainable mobility and accessibility through designing better local services and walkable neighbourhoods in order to promote safe environments and healthy communities;
• Sustainable design of the urban environment through climate-responsive design in order to increase the efficient use of solar energy to provide thermal comfort; and
• Use of renewable resources through creating efficient communities in order to provide long-term management of natural resources for the sustainability of future generations.
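The indicator-to-index aggregation described above (normalise each indicator against benchmark thresholds, combine into category sub-indices at the parcel scale, then into a composite at the neighbourhood scale) can be sketched as follows. The six category names come from the abstract; all indicator values, thresholds and the equal-weight aggregation are invented for illustration, not taken from the MUSIX model itself.

```python
def normalise(value, worst, best):
    """Min-max normalise an indicator onto [0, 1] against benchmark thresholds.

    Passing worst > best handles indicators where a lower raw value is better.
    """
    score = (value - worst) / (best - worst)
    return max(0.0, min(1.0, score))

# Hypothetical parcel-scale indicator readings per category.
category_scores = {
    "hydrology":  [normalise(55, 0, 100), normalise(70, 0, 100)],
    "ecology":    [normalise(30, 0, 100)],
    "pollution":  [normalise(80, 100, 0)],   # lower raw value is better here
    "location":   [normalise(60, 0, 100)],
    "design":     [normalise(45, 0, 100)],
    "efficiency": [normalise(50, 0, 100)],
}

# Parcel-scale sub-indices, then an equal-weight neighbourhood composite.
sub_indices = {cat: sum(vals) / len(vals) for cat, vals in category_scores.items()}
composite = sum(sub_indices.values()) / len(sub_indices)
print(f"composite index = {composite:.2f}")
```

Benchmarking against designated thresholds is what lets the sub-indices be compared across parcels and aggregated to the local level without the cross-country comparability problems the abstract criticises.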

Relevance: 30.00%

Abstract:

The affordances concept describes the possibilities for goal-oriented action that technical objects offer to specified users. This notion has received growing attention from IS researchers. However, few studies have gone beyond contextualizing parts of the concept to a specific setting – the tip of the iceberg. In this research-in-progress paper, we report on our efforts to further develop the IS discipline’s understanding of affordances from informational objects. Specifically, we seek to extend extant theory on the origin and actualization of affordances. We develop a model that describes the process by which affordances are perceived and actualized and their dependence on information and actualization effort. We illustrate our emergent theory in the context of conceptual process models used by analysts for purposes of information systems analysis and design. We offer suggestions for operationalizing and testing this model empirically, and provide details about our design of a mixed-methods study currently in progress.

Relevance: 30.00%

Abstract:

Pesticides used in agricultural systems must be applied in economically viable and environmentally sensitive ways, and this often requires expensive field trials on spray deposition and retention by plant foliage. Computational models to describe whether a spray droplet sticks (adheres), bounces or shatters on impact, and whether any rebounding parent or shatter daughter droplets are recaptured, would provide an estimate of spray retention and thereby act as a useful guide prior to any field trials. Parameter-driven interactive software has been implemented to enable the end-user to study and visualise droplet interception and impaction on a single, horizontal leaf. Living chenopodium, wheat and cotton leaves have been scanned to capture the surface topography, and realistic virtual leaf surface models have been generated. Individual leaf models have then been subjected to virtual spray droplets and predictions made of droplet interception with the virtual plant leaf. Thereafter, the impaction behaviour of the droplets and the subsequent behaviour of any daughter droplets, up until recapture, are simulated to give the predicted total spray retention by the leaf. A series of critical thresholds for the stick, bounce, and shatter elements in the impaction process has been developed for different combinations of formulation, droplet size and velocity, and leaf surface characteristics to provide this output. The results show that droplet properties, spray formulations and leaf surface characteristics all influence the predicted amount of spray retained on a horizontal leaf surface. Overall, the predicted spray retention increases as formulation surface tension, static contact angle, droplet size and velocity decrease. Predicted retention on cotton is much higher than on chenopodium. 
The average predicted retention on a single horizontal leaf across all droplet size, velocity and formulations scenarios tested, is 18, 30 and 85% for chenopodium, wheat and cotton, respectively.
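Stick/bounce/shatter criteria of the kind described above are commonly expressed through dimensionless impact numbers such as the Weber number. The sketch below is not the paper's model: the threshold values are placeholders, whereas the study derives its critical thresholds per formulation, droplet size and velocity, and leaf surface.

```python
def weber(density, velocity, diameter, surface_tension):
    """Weber number: ratio of impact inertia to surface-tension forces (SI units)."""
    return density * velocity**2 * diameter / surface_tension

def impact_outcome(we, bounce_threshold=5.0, shatter_threshold=80.0):
    """Classify a droplet impact. Threshold values here are illustrative only."""
    if we < bounce_threshold:
        return "stick"
    if we < shatter_threshold:
        return "bounce"
    return "shatter"

# A 300-micron water-like droplet (surface tension ~0.072 N/m) at 2 m/s:
we = weber(1000.0, 2.0, 300e-6, 0.072)
print(we, impact_outcome(we))
```

In this simplified classifier, larger and faster droplets carry higher Weber numbers and so are pushed from stick towards bounce and shatter, consistent with the abstract's finding that predicted retention increases as droplet size and velocity decrease.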

Relevance: 30.00%

Abstract:

Introduction: Apoptosis is the final fate of many cells in the body, though this process is also observed in pathological settings. One such pathological process is non-traumatic osteonecrosis of the femoral head. Among the many pro- and anti-apoptotic factors, nitric oxide has recently attracted particular interest. Osteocyte apoptosis and its relation to pro-apoptotic action invite further research, and the inducible form of nitric oxide synthase (iNOS), which produces a high concentration of nitric oxide, has been flagged. The aim of this study was to investigate the effect of hyperbaric oxygen (HBO) and an inducible NOS suppressor (aminoguanidine) in the prevention of femoral head osteonecrosis in an experimental model of osteonecrosis in spontaneously hypertensive rats (SHRs). Methods: After animal ethics approval, 34 SHR rats were divided into four groups. Ten rats were allocated to the control group without any treatment, and eight rats were allocated to each of three treatment groups, namely: HBO, aminoguanidine (AMG), and the combination of HBO and AMG treatments (HBO+AMG). The HBO group received 250 kPa of oxygen via hyperbaric chamber for 30 days starting at the fifth week of life; the AMG group received 1 mg/ml of AMG in drinking water from the fifth week until the 17th week of life; and the last group received a combination of these treatments. Rats were sacrificed at the end of the 17th week of life and both femurs were analysed for evidence of osteonecrosis using micro-CT scanning and H&E staining. Osteocyte apoptosis and the presence of two different forms of NOS (inducible (iNOS) and endothelial (eNOS)) were also analysed by immunostaining and apoptosis staining (Hoechst and TUNEL). Results: Bone morphology of the metaphyseal and epiphyseal areas of all rats was investigated and analysed. 
Micro-CT findings revealed significantly higher mean fractional trabecular bone volume (FBV) in the metaphyseal area of untreated SHRs compared with all other treatments (HBO, P<0.05; HBO+AMG, P<0.005; and AMG, P<0.001). Bone surface to volume ratio also significantly increased with HBO+AMG and AMG treatments when compared with the control group (18.7 vs 20.8, P<0.05, and 18.7 vs 21.1, P<0.05). Epiphyseal mean FBV did not change significantly among groups. In the metaphyseal area, trabecular thickness and numbers significantly decreased with AMG treatment, while trabecular separation significantly increased with both AMG and HBO+AMG treatment. The histological ratio of no ossification and osteonecrosis was 37.5%, 43.7%, 18.7% and 6.2% in the control, HBO, HBO+AMG and AMG groups respectively, with the only significant difference observed between the HBO and AMG treatments (P<0.01). A high concentration of iNOS was observed in the region of osteonecrosis, while there was no evidence of eNOS activity around that region. In comparison with the control group, the ratio of osteocyte apoptosis was significantly reduced with AMG treatment (P<0.005). We also observed significantly fewer apoptotic osteocytes in the AMG group compared with the HBO treatment (P<0.05). Conclusion: None of our treatments prevented osteonecrosis at the histological or micro-CT level. The high concentration of iNOS in the region of osteonecrosis and the significant reduction of osteocyte apoptosis with AMG treatment support a role for iNOS in modulating osteocyte apoptosis in SHRs.

Relevance: 30.00%

Abstract:

A large subsurface, elevated temperature anomaly is well documented in Central Australia. High Heat Producing Granites (HHPGs) intersected by drilling at Innamincka are often assumed to be the dominant cause of the elevated subsurface temperatures, although their presence in other parts of the temperature anomaly has not been confirmed. Geological controls on the temperature anomaly remain poorly understood. Additionally, methods previously used to predict temperature at 5 km depth in this area are simplistic and possibly do not give an accurate representation of the true distribution and magnitude of the temperature anomaly. Here we re-evaluate the geological controls on geothermal potential in the Queensland part of the temperature anomaly using a stochastic thermal model. The results illustrate that the temperature distribution is most sensitive to the thermal conductivity structure of the top 5 km. Furthermore, the results indicate the presence of silicic crust enriched in heat producing elements between and 40 km.
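For intuition on why the shallow thermal conductivity structure and crustal heat production dominate predicted temperatures at 5 km, note that a steady-state one-dimensional conductive geotherm with uniform heat production A satisfies T(z) = T0 + (q0/k)z - Az^2/(2k). The sketch below uses round illustrative numbers, not the model's calibrated inputs:

```python
def temperature_at_depth(z, t_surface, q_surface, k, heat_production):
    """Steady-state 1-D conductive geotherm with uniform heat production.

    T(z) = T0 + (q0 / k) * z - A * z**2 / (2 * k), with z in metres,
    q0 the surface heat flow (W/m^2), k the thermal conductivity (W/m/K),
    and A the volumetric heat production (W/m^3).
    """
    return t_surface + (q_surface / k) * z - heat_production * z**2 / (2 * k)

# Illustrative values only: 90 mW/m^2 surface heat flow, conductivity
# 3 W/m/K, and granite-like heat production of 5 microwatts/m^3.
t_5km = temperature_at_depth(5000.0, 25.0, 0.09, 3.0, 5e-6)
print(f"T(5 km) ~ {t_5km:.0f} degC")
```

Halving k in this expression roughly doubles the conductive contribution to T(5 km), which illustrates the sensitivity to near-surface thermal conductivity that the stochastic model highlights.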

Relevance: 30.00%

Abstract:

Increasing global competition, rapid technological changes, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring performance improvements in dynamic lean supply chain situations. Appropriate measurement of lean supply chain performance has therefore become imperative. Many lean tools are available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involved in high volume products but may not be effective for low volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of supply chains, consisting of both quantitative and qualitative metrics, and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics and the effects of various lean tools on the performance metrics mentioned in the SCOR framework have been investigated. A lean supply chain model based on the SCOR metric framework is then developed in which non-lean and lean as well as quantitative and qualitative metrics are incorporated as appropriate. 
The values of the appropriate metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data were collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy-based method was then applied to measure the performance improvements in the supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative so as to maximise similarity to the positive ideal solution and minimise similarity to the negative ideal solution, the performances of lean and non-lean supply chain situations for three different apparel products were evaluated. To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses were investigated. Empirical results show that the implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS-based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics. It can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have significant effects on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
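The fuzzy TOPSIS step described above can be sketched as follows: performance ratings are held as triangular fuzzy numbers, distances to the positive and negative ideal solutions are computed with the vertex method, and each alternative receives a closeness coefficient in [0, 1]. The criterion names and fuzzy ratings below are invented; the thesis works with SCOR-derived metrics and real apparel data.

```python
import math

def tfn_distance(a, b):
    """Vertex distance between triangular fuzzy numbers a=(a1,a2,a3), b=(b1,b2,b3)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3)

def closeness(alternative, positive_ideal, negative_ideal):
    """TOPSIS closeness coefficient: 1 at the positive ideal, 0 at the negative."""
    d_pos = sum(tfn_distance(v, positive_ideal[c]) for c, v in alternative.items())
    d_neg = sum(tfn_distance(v, negative_ideal[c]) for c, v in alternative.items())
    return d_neg / (d_pos + d_neg)

# Hypothetical normalised fuzzy ratings on three benefit criteria, for one
# product before and after lean implementation.
pos = {c: (1.0, 1.0, 1.0) for c in ("time", "quality", "flexibility")}
neg = {c: (0.0, 0.0, 0.0) for c in ("time", "quality", "flexibility")}
before = {"time": (0.3, 0.4, 0.5), "quality": (0.5, 0.6, 0.7), "flexibility": (0.2, 0.3, 0.4)}
after  = {"time": (0.6, 0.7, 0.8), "quality": (0.7, 0.8, 0.9), "flexibility": (0.5, 0.6, 0.7)}
print(f"before: {closeness(before, pos, neg):.2f}, after: {closeness(after, pos, neg):.2f}")
```

A rise in the closeness coefficient after lean implementation is the kind of single-number improvement measure the thesis aims for.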

Relevance: 30.00%

Abstract:

Indirect inference (II) is a methodology for estimating the parameters of an intractable (generative) model on the basis of an alternative parametric (auxiliary) model that is both analytically and computationally easier to deal with. Such an approach has been well explored in the classical literature but has received substantially less attention in the Bayesian paradigm. The purpose of this paper is to compare and contrast a collection of what we call parametric Bayesian indirect inference (pBII) methods. One class of pBII methods uses approximate Bayesian computation (referred to here as ABC II), where the summary statistic is formed on the basis of the auxiliary model, using ideas from II. Another approach proposed in the literature, referred to here as parametric Bayesian indirect likelihood (pBIL), we show to be a fundamentally different approach from ABC II. We devise new theoretical results for pBIL to give extra insights into its behaviour and its differences from ABC II. Furthermore, we examine in more detail the assumptions required to use each pBII method. The results, insights and comparisons developed in this paper are illustrated on simple examples and two substantive applications. The first substantive example involves performing inference for complex quantile distributions based on simulated data, while the second estimates the parameters of a trivariate stochastic process describing the evolution of macroparasites within a host, based on real data. We create a novel framework called Bayesian indirect likelihood (BIL), which encompasses pBII as well as general ABC methods, so that the connections between the methods can be established.
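The ABC II idea (use the auxiliary model's parameter estimates as the summary statistic inside approximate Bayesian computation) can be sketched as follows. Everything here is a toy stand-in: the "intractable" model is just a shifted exponential, the auxiliary model is a Gaussian fitted by maximum likelihood, and plain rejection ABC is used; the paper's actual applications are quantile distributions and a macroparasite model.

```python
import random
import statistics

random.seed(1)

def simulate(theta, n=200):
    """Toy generative model standing in for an intractable one."""
    return [theta + random.expovariate(1.0) for _ in range(n)]

def auxiliary_fit(data):
    """Auxiliary-model summary statistic: Gaussian MLEs (mean, sd)."""
    return statistics.fmean(data), statistics.pstdev(data)

# Rejection ABC II: keep prior draws of theta whose auxiliary-model fit
# lands close to the auxiliary-model fit of the observed data.
observed = simulate(3.0)
s_obs = auxiliary_fit(observed)
kept = []
for _ in range(2000):
    theta = random.uniform(0.0, 6.0)                       # flat prior draw
    s_sim = auxiliary_fit(simulate(theta))
    dist = sum((a - b) ** 2 for a, b in zip(s_sim, s_obs)) ** 0.5
    if dist < 0.25:                                        # ABC tolerance
        kept.append(theta)

print(f"{len(kept)} accepted draws; posterior mean near the true theta = 3")
```

The accepted draws approximate the posterior conditioned on the auxiliary summaries rather than the full data, which is what distinguishes ABC II from the pBIL route of using the auxiliary likelihood directly.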

Relevance: 30.00%

Abstract:

There is increasing momentum in cancer care to implement a two stage assessment process that accurately determines the ability of older patients to cope with, and benefit from, chemotherapy. The two-step approach aims to ensure that patients clearly fit for chemotherapy can be accurately identified and referred for treatment without undergoing a time- and resource-intensive comprehensive geriatric assessment (CGA). Ideally, this process removes the uncertainty of how to classify and then appropriately treat the older cancer patient. After trialling a two-stage screen and CGA process in the Division of Cancer Services at Princess Alexandra Hospital (PAH) in 2011-2012, we implemented a model of oncogeriatric care based on our findings. In this paper, we explore the methodological and practical aspects of implementing the PAH model and outline further work needed to refine the process in our treatment context.

Relevance: 30.00%

Abstract:

The use of mobile phones while driving is more prevalent among young drivers—a less experienced cohort with elevated crash risk. The objective of this study was to examine and better understand the reaction times of young drivers to a traffic event originating in their peripheral vision whilst engaged in a mobile phone conversation. The CARRS-Q Advanced Driving Simulator was used to test a sample of young drivers on various simulated driving tasks, including an event that originated within the driver’s peripheral vision, whereby a pedestrian enters a zebra crossing from a sidewalk. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free and handheld. In addition to driving the simulator, each participant completed questionnaires related to driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The participants were 21 to 26 years old and split evenly by gender. Drivers’ reaction times to a pedestrian in the zebra crossing were modelled using a parametric accelerated failure time (AFT) duration model with a Weibull distribution. Two different model specifications were also tested to account for the structured heterogeneity arising from the repeated measures experimental design. The Weibull AFT model with gamma heterogeneity was found to be the best fitting model and identified four significant variables influencing reaction times: phone condition, driver’s age, license type (Provisional license holder or not), and self-reported frequency of usage of handheld phones while driving. The reaction times of drivers were more than 40% longer in the distracted condition compared to baseline (not distracted). Moreover, the impairment of reaction times due to mobile phone conversations was almost double for provisional compared to open license holders. 
A reduction in the ability to detect traffic events in the periphery whilst distracted presents a significant and measurable safety concern that will undoubtedly persist unless mitigated.
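In an accelerated failure time model, a covariate acts multiplicatively on the event time, so the reported 40% longer reaction times correspond to an acceleration factor of about 1.4, and every quantile of the reaction-time distribution scales by that factor. A simulation sketch (distribution parameters invented; the study's actual fit, a Weibull AFT with gamma heterogeneity, would require a survival-analysis package):

```python
import random
import statistics

random.seed(7)

# Under AFT, T_distracted = AF * T_baseline; the abstract implies AF ~ 1.4.
AF = 1.4
SHAPE = 2.0   # Weibull shape parameter (invented for illustration)

# random.weibullvariate(scale, shape): scaling the scale parameter by AF
# scales every draw (hence every quantile) by AF.
baseline = [random.weibullvariate(1.0, SHAPE) for _ in range(5000)]
handheld = [random.weibullvariate(AF, SHAPE) for _ in range(5000)]

# The ratio of medians therefore recovers the acceleration factor.
ratio = statistics.median(handheld) / statistics.median(baseline)
print(f"estimated acceleration factor ~ {ratio:.2f}")
```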

Relevance: 30.00%

Abstract:

Construction works are project-based and interdisciplinary. Many construction management (CM) problems are ill-defined. The knowledge required to address such problems is not readily available and is mostly tacit in nature. Moreover, researchers, especially students in higher education, often face difficulty in defining the research problem and adopting an appropriate research process and methodology for designing and validating their research. This paper describes a ‘Horseshoe’ research process approach and its application to a research problem of extracting construction-relevant information from a building information model (BIM). It describes the different steps of the process for understanding a problem, formulating appropriate research questions and defining different research tasks, including a methodology for developing, implementing and validating the research. It is argued that a structured research approach and the use of mixed research methods provide a sound basis for research design and validation, and thereby for making a contribution to existing knowledge.

Relevance: 30.00%

Abstract:

Purpose: The purpose of this paper is to investigate the role of multiple actors in the value creation process for a preventative health service, and observe the subsequent impact on key service outcomes of satisfaction and customer behaviour intentions to use a preventative health service again in the future.
Design/methodology/approach: An online self-completion survey of Australian women (n=797) was conducted to test the proposed framework in the context of a free, government-provided breastscreening service. Data were analysed using Structural Equation Modelling (SEM).
Findings: The findings indicate that functional and emotional value are created from organisational and customer resources. These findings indicate that health service providers and customers are jointly responsible for the successful creation of value, leading to desirable outcomes for all stakeholders.
Practical implications: The results highlight to health professionals the aspects of service that can be managed in order to create value with target audiences. The findings also indicate the importance of the resources provided by users in the creation of value, signifying the importance of customer education and management.
Originality/value: This study provides a significant contribution to social marketing through the provision of an empirically validated model of value creation in a preventative health service. The model demonstrates how the creation and provision of value can lead to the achievement of desirable social behaviours - a key aim of social marketing.

Relevance: 30.00%

Abstract:

Asset service organisations often recognise asset management as a core competence that delivers benefits to their business. But how do organisations know whether their asset management processes are adequate? Asset management maturity models, which combine best practices and competencies, provide a useful approach for testing the capacity of organisations to manage their assets. Asset management frameworks are required to meet the dynamic challenges of managing assets in contemporary society. Although existing models are subject to wide variations in their implementation and sophistication, they also display a distinct weakness in that they tend to focus primarily on the operational and technical level and neglect the levels of strategy, policy and governance as well as social and human resources – the people elements. Moreover, asset management maturity models have to respond to external environmental factors, such as climate change and sustainability, stakeholders, and community demand management. Drawing on five dimensions of effective asset management – spatial, temporal, organisational, statistical, and evaluation – as identified by Amadi-Echendu et al. [1], this paper carries out a comprehensive comparative analysis of six existing maturity models to identify the gaps in key process areas. The results suggest incorporating these into an integrated approach for assessing the maturity of asset-intensive organisations. It is contended that the adoption of an integrated asset management maturity model will enhance effective and efficient delivery of services.
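A maturity assessment of the kind compared above ultimately reduces to scoring an organisation on each dimension against a target level and reading off the gaps. A minimal sketch using the five dimensions named in the abstract; the scores, scale and target level are invented:

```python
# Hypothetical maturity scores (1 = initial ... 5 = optimising) for one
# organisation across the five dimensions from the abstract.
dimensions = {
    "spatial": 4,
    "temporal": 3,
    "organisational": 2,
    "statistical": 3,
    "evaluation": 2,
}

TARGET = 4  # desired maturity level set by the organisation

# Gap for each dimension falling short of the target, plus an overall average.
gaps = {d: TARGET - s for d, s in dimensions.items() if s < TARGET}
overall = sum(dimensions.values()) / len(dimensions)
print(f"overall maturity {overall:.1f}; gaps: {gaps}")
```

An integrated model of the kind the paper argues for would extend the dimension list beyond the operational and technical level to strategy, policy, governance and the people elements, but the gap arithmetic stays the same.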

Relevance: 30.00%

Abstract:

In recent years, there has been a significant increase in the popularity of ontological analysis of conceptual modelling techniques. To date, related research explores the ontological deficiencies of classical techniques such as ER or UML modelling, as well as business process modelling techniques such as ARIS or even Web Services standards such as BPEL4WS, BPML, ebXML, BPSS and WSCI. While the ontologies that form the basis of these analyses are reasonably mature, it is the actual process of an ontological analysis that still lacks rigour. The current procedure is prone to individual interpretations and is one reason for criticism of the entire ontological analysis. This paper presents a procedural model for ontological analysis based on the use of meta models, multiple coders and metrics. The model is supported by examples from various ontological analyses.
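One way to make the analysis procedure less dependent on individual interpretation is to compute the deficiency metrics mechanically from a coded mapping between ontological constructs and grammar constructs. The sketch below uses the four standard deficiencies (construct deficit, excess, overload and redundancy); the construct names and the mapping itself are invented for illustration.

```python
# Coded mapping: (ontological construct, grammar construct) pairs agreed
# by multiple coders. Names are invented for illustration.
mapping = {
    ("thing", "entity"),
    ("property", "attribute"),
    ("state", "attribute"),      # two ontology constructs -> one grammar construct
    ("transformation", "process"),
}
ontology = {"thing", "property", "state", "event", "transformation"}
grammar = {"entity", "attribute", "process", "connector"}

mapped_onto = {o for o, _ in mapping}
mapped_gram = {g for _, g in mapping}

deficit = ontology - mapped_onto   # ontology constructs with no grammar construct
excess = grammar - mapped_gram     # grammar constructs with no ontological meaning
overload = {g for g in mapped_gram             # one grammar construct, many meanings
            if sum(1 for _, g2 in mapping if g2 == g) > 1}
redundancy = {o for o in mapped_onto           # one meaning, many grammar constructs
              if sum(1 for o2, _ in mapping if o2 == o) > 1}

print(deficit, excess, overload, redundancy)
```

Deriving the metrics from the agreed mapping, rather than from narrative judgement, is one concrete way the meta-model/multiple-coder/metrics procedure adds the rigour the paper calls for.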