936 results for Process analysis
Abstract:
Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that run counter to this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities in developing it into a flexible, visual, collaborative, scalable and open system.
Abstract:
This study draws on communication accommodation theory, social identity theory and cognitive dissonance theory to drive a ‘Citizen’s Round Table’ process that engages community audiences on energy technologies and strategies that potentially mitigate climate change. The study examines the effectiveness of the process in determining the strategies that engage people in discussion. The process is designed to canvass participants’ perspectives and potential reactions to the array of renewable and non-renewable energy sources, in particular, underground storage of CO2. Ninety-five people (12 groups) participated in the process. Questionnaires were administered three times to identify changes in attitudes over time, and analysis of video, audio transcripts and observer notes enabled an evaluation of the level of engagement and communication among participants. The key findings of this study indicate that the public can be meaningfully engaged in discussion on the politically sensitive issue of CO2 capture and storage (CCS) and other low emission technologies. The round table process was critical to participants’ engagement and led to attitude change towards some methods of energy production. This study identifies a process that can be used successfully to explore community attitudes on politically sensitive topics and encourages an examination of attitudes and potential attitude change.
Abstract:
Power system operation and planning are facing increasing uncertainties, especially with the deregulation process and increasing demand for power. Probabilistic power system stability assessment and probabilistic power system planning have been identified by EPRI as important trends in power system operation and planning. Probabilistic small signal stability assessment studies the impact of system parameter uncertainties on system small disturbance stability characteristics. Research in this area has covered many uncertainty factors, such as controller parameter uncertainties and generation uncertainties. One of the most important factors in power system stability assessment is load dynamics. In this paper, a composite load model is used to study the impact of load parameter uncertainties on system small signal stability characteristics. The results provide useful insight into the significant stability impact that load dynamics bring to the system, and can be used to help system operators in operation and planning analysis.
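The flavour of such a probabilistic assessment can be sketched in a few lines. The snippet below is illustrative only: it uses a hypothetical one-machine linearized swing model with assumed inertia, synchronizing and damping parameters (not the composite load model of the paper), samples the uncertain load damping coefficient, and estimates the probability that the small-signal modes stay well damped.

```python
import numpy as np

def small_signal_stability_probability(n_samples=5000, seed=0):
    """Monte Carlo sketch: sample an uncertain load damping coefficient and
    estimate the probability that the linearized system remains well damped.
    All parameter values and distributions here are hypothetical."""
    rng = np.random.default_rng(seed)
    M, K = 10.0, 1.5                       # assumed inertia and synchronizing coefficients
    D = rng.normal(2.0, 0.6, n_samples)    # uncertain load damping (assumed distribution)
    well_damped = 0
    for d in D:
        # linearized swing dynamics: x = [delta-deviation, speed-deviation]
        A = np.array([[0.0, 1.0], [-K / M, -d / M]])
        eig = np.linalg.eigvals(A)
        # damping ratio of the dominant oscillatory mode: zeta = -Re(lambda)/|lambda|
        zeta = -eig.real.max() / np.abs(eig).max()
        if eig.real.max() < 0 and zeta > 0.05:
            well_damped += 1
    return well_damped / n_samples
```

The returned fraction is the Monte Carlo estimate of the probability that the system satisfies both the stability and the minimum-damping criterion under the assumed parameter uncertainty.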
Abstract:
Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. The assessment of model uncertainty is therefore an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool that makes the best use of the available knowledge in the prediction, and thereby presents a practical solution to counteract the limitations otherwise imposed on water quality modelling.
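The contrast drawn above can be illustrated with a minimal sketch: fit a commonly used power-law build-up model, load = a · days^b (an assumed model form, not necessarily the paper's), by ordinary least squares on log-transformed data, then propagate the estimated parameter uncertainty by Monte Carlo to obtain a prediction interval instead of a single fixed-input estimate.

```python
import numpy as np

def buildup_prediction_interval(dry_days, loads, d_new, n_draws=5000, seed=0):
    """Sketch (illustrative, not the paper's exact method): fit the
    log-linear build-up model log(load) = b0 + b1*log(days) by OLS, then
    draw parameters from their estimated covariance to get a 90% interval
    for the predicted build-up at d_new antecedent dry days."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones_like(dry_days, dtype=float), np.log(dry_days)])
    y = np.log(loads)
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    n, k = X.shape
    sigma2 = float(res[0]) / (n - k)            # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)       # OLS parameter covariance
    draws = rng.multivariate_normal(beta, cov, n_draws)
    preds = np.exp(draws[:, 0] + draws[:, 1] * np.log(d_new))
    return np.percentile(preds, [5, 50, 95])    # (lower, median, upper)
```

The spread of the interval makes the parameter uncertainty explicit, which is exactly what a fixed-input OLS point prediction hides.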
Abstract:
The rapid growth of services available on the Internet and exploited through ever globalizing business networks poses new challenges for service interoperability. New services, from consumer “apps”, enterprise suites, platform and infrastructure resources, are vying for demand with quickly evolving and overlapping capabilities, and shorter cycles of extending service access from user interfaces to software interfaces. Services, drawn from a wider global setting, are subject to greater change and heterogeneity, demanding new requirements for structural and behavioral interface adaptation. In this paper, we analyze service interoperability scenarios in global business networks, and propose new patterns for service interactions, beyond those proposed over the last ten years through the development of Web service standards and process choreography languages. By contrast, we reduce assumptions of design-time knowledge required to adapt services, giving way to run-time mismatch resolution, extend the focus from bilateral to multilateral messaging interactions, and propose declarative ways in which services and interactions take part in long-running conversations via the explicit use of state.
Abstract:
Background: The 30-item USDI is a self-report measure that assesses depressive symptoms among university students. It consists of three correlated factors: Lethargy, Cognitive-Emotional and Academic Motivation. The current research used confirmatory factor analysis to assess construct validity and determine whether the original factor structure would be replicated in a different sample. Psychometric properties were also examined. Method: Participants were 1148 students (mean age 22.84 years, SD = 6.85) across all faculties of a large Australian metropolitan university. Students completed a questionnaire comprising the USDI, the Depression Anxiety Stress Scale (DASS) and the Life Satisfaction Scale (LSS). Results: The three correlated factor model was shown to be an acceptable fit to the data, indicating sound construct validity. Internal consistency of the scale was also demonstrated to be sound, with high Cronbach's alpha values. Temporal stability of the scale was shown to be strong through test-retest analysis. Finally, concurrent and discriminant validity were examined through correlations between the USDI and DASS subscales as well as the LSS, with sound results further supporting the construct validity of the scale. Cut-off points were also developed to aid total score interpretation. Limitations: Response rates are unclear. In addition, the representativeness of the sample could potentially be improved through targeted recruitment (i.e. reviewing the online sample statistics during data collection, examining representativeness trends and addressing particular faculties within the university that were underrepresented). Conclusions: The USDI provides a valid and reliable method of assessing depressive symptoms among university students.
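The internal-consistency statistic reported above is Cronbach's alpha, which can be computed directly from an item-score matrix; a minimal sketch:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix --
    the internal-consistency statistic reported for scales such as the USDI.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)
```

Perfectly correlated items yield an alpha of 1; values around 0.8 or higher are conventionally read as sound internal consistency.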
Abstract:
Business Process Management (BPM) is the art and science of how work should be performed in an organization in order to ensure consistent outputs and to take advantage of improvement opportunities, e.g. reducing costs, execution times or error rates. Importantly, BPM is not about improving the way individual activities are performed, but rather about managing entire chains of events, activities and decisions that ultimately produce added value for an organization and its customers. This textbook encompasses the entire BPM lifecycle, from process identification to process monitoring, covering along the way process modelling, analysis, redesign and automation. Concepts, methods and tools from business management, computer science and industrial engineering are blended into one comprehensive and inter-disciplinary approach. The presentation is illustrated using the BPMN industry standard defined by the Object Management Group and widely endorsed by practitioners and vendors worldwide. In addition to explaining the relevant conceptual background, the book provides dozens of examples, more than 100 hands-on exercises – many with solutions – as well as numerous suggestions for further reading. The textbook is the result of many years of combined teaching experience of the authors, both at the undergraduate and graduate levels as well as in the context of professional training. Students and professionals from both business management and computer science will benefit from the step-by-step style of the textbook and its focus on fundamental concepts and proven methods.
Lecturers will appreciate the class-tested format and the additional teaching material available on the accompanying website fundamentals-of-bpm.org.
Abstract:
Recent literature has argued that environmental efficiency (EE), which is built on the materials balance (MB) principle, is more suitable than other EE measures in situations where the law of mass conservation regulates production processes. In addition, the MB-based EE method is particularly useful in analysing possible trade-offs between cost and environmental performance. Identifying determinants of MB-based EE can provide useful information to decision makers, but there are very few empirical investigations into this issue. This article proposes the use of data envelopment analysis and stochastic frontier analysis techniques to analyse variation in MB-based EE. Specifically, the article develops a stochastic nutrient frontier and nutrient inefficiency model to analyse determinants of MB-based EE. The empirical study applies both techniques to investigate the MB-based EE of 96 rice farms in South Korea. The size of land, fertiliser consumption intensity, cost allocative efficiency, and the share of owned land out of total land are found to be correlated with MB-based EE. The results confirm the presence of a trade-off between MB-based EE and cost allocative efficiency, and this finding favours policy interventions that help farms simultaneously achieve cost efficiency and MB-based EE.
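For intuition, a deliberately simplified, free-disposal-hull style score (not the paper's DEA or stochastic frontier models) can illustrate what an MB-based EE measure looks like: each farm's nutrient surplus is benchmarked against the smallest surplus achieved by any peer producing at least as much output.

```python
import numpy as np

def mb_efficiency(outputs, nutrient_balances):
    """Toy materials-balance EE score (illustrative only): for each farm,
    divide the smallest nutrient surplus among peers producing at least as
    much output by the farm's own surplus. A score of 1 means no observed
    peer achieves the same or more output with a smaller surplus."""
    outputs = np.asarray(outputs, dtype=float)
    b = np.asarray(nutrient_balances, dtype=float)
    scores = np.empty_like(b)
    for i in range(b.size):
        peers = b[outputs >= outputs[i]]   # farms producing at least as much
        scores[i] = peers.min() / b[i]
    return scores
```

A farm with a score of 0.5 could, on the evidence of its peers, produce at least the same output with half the nutrient surplus.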
Abstract:
This research examines the entrepreneurship phenomenon and the question: why are some venture attempts more successful than others? This question is not a new one. Prior research has answered it by describing those who engage in nascent entrepreneurship. Yet this approach has yielded little consensus and offers little comfort for those newly considering venture creation (Gartner, 1988). Rather, this research considers the process of venture creation, by focusing on the actions of nascent entrepreneurs. However, the venture creation process is complex (Liao, Welsch, & Tan, 2005) and multi-dimensional (Davidsson, 2004). The process can vary in the amount of action engaged in by the entrepreneur; the temporal dynamics of how action is enacted (Lichtenstein, Carter, Dooley, & Gartner, 2007); or the sequence in which actions are undertaken. Little is known about whether any, or all three, of these dimensions matter. Further, there exists scant general knowledge about how the venture creation process influences venture creation outcomes (Gartner & Shaver, 2011). Therefore, this research conducts a systematic study of what entrepreneurs do as they create a new venture. The primary goal is to develop general principles so that advice may be offered on how to ‘proceed’, rather than how to ‘be’. Three integrated empirical studies were conducted that separately focus on each of the interrelated dimensions. The basis for this was a randomly sampled, longitudinal panel of nascent ventures. Upon recruitment these ventures were in the process of being created, but yet to be established as new businesses. The ventures were tracked one year later to follow up on outcomes. Accordingly, this research makes the following original contributions to knowledge. First, the findings suggest that all three of the dimensions play an important role: action, dynamics, and sequence. This implies that future research should take a multi-dimensional view of the venture creation process.
Failing to do so can only result in a limited understanding of a complex phenomenon. Second, action is the fundamental means through which venture creation is achieved. Simply put, more active venture creation efforts are more likely to be successful. Further, action is the medium which allows resource endowments their effect upon venture outcomes. Third, the dynamics of how venture creation plays out over time are also influential. Here, a process with a high rate of action which increases in intensity will more likely achieve positive outcomes. Fourth, sequence analysis suggests that the order in which actions are taken will also drive outcomes. Although venture creation generally flows in sequence from discovery toward exploitation (Shane & Venkataraman, 2000; Eckhardt & Shane, 2003; Shane, 2003), processes that actually proceed in this way are less likely to be realized. Instead, processes which specifically intertwine discovery and exploitation action together in symbiosis more likely achieve better outcomes (Sarasvathy, 2001; Baker, Miner, & Eesley, 2003). Further, an optimal venture creation order exists somewhere between these sequential and symbiotic process archetypes. A process which starts out as symbiotic discovery and exploitation, but switches to focus exclusively on exploitation later on, is most likely to achieve venture creation. These sequence findings are unique, and suggest future integration between opposing theories of order in venture creation.
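The kind of comparison underlying such sequence analysis can be sketched with a simple edit distance between ordered action lists (the action codes below are hypothetical, not the study's coding scheme):

```python
def sequence_distance(a, b):
    """Levenshtein edit distance between two ordered action sequences:
    the number of insertions, deletions or substitutions needed to turn
    one venture's action list into another's. Sequence-analysis methods
    use distances like this to cluster processes into archetypes."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                     # delete all of a's prefix
    for j in range(n + 1):
        d[0][j] = j                     # insert all of b's prefix
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,       # deletion
                          d[i][j - 1] + 1,       # insertion
                          d[i - 1][j - 1] + cost)  # substitution/match
    return d[m][n]
```

Low pairwise distances group ventures whose creation processes followed similar action orders, e.g. the sequential versus symbiotic archetypes discussed above.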
Abstract:
This work has led to the development of empirical mathematical models to quantitatively predict the changes of morphology in osteocyte-like cell lines (MLO-Y4) in culture. MLO-Y4 cells were cultured at low density and the changes in morphology recorded over 11 hours. Cell area and three dimensionless shape features (aspect ratio, circularity and solidity) were then determined using widely accepted image analysis software (ImageJ). Based on the data obtained from the image analysis, mathematical models were developed using the non-linear regression method. The developed mathematical models accurately predict the morphology of MLO-Y4 cells for different culture times and can, therefore, be used as a reference model for analyzing MLO-Y4 cell morphology changes within various biological/mechanical studies, as necessary.
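A minimal sketch of the non-linear regression step, assuming a saturating-growth model form A(t) = A_max · (1 − e^(−kt)) for cell area over culture time (the actual model forms fitted in the study may differ):

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_spreading_model(t, area):
    """Non-linear least-squares fit of an assumed saturating-growth model
    for cell area versus culture time. Returns the fitted (a_max, k)."""
    def model(t, a_max, k):
        return a_max * (1.0 - np.exp(-k * t))
    popt, _ = curve_fit(model, t, area, p0=[area.max(), 0.5])
    return popt
```

The fitted curve can then serve as the reference trajectory against which observed morphology changes under different biological or mechanical conditions are compared.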
Abstract:
Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably with known levels at which it is significantly different from zero and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze evoked EEG response to flash visual stimuli to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG which is less invasive and less expensive than magnetic resonance imaging.
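The core quantity, ensemble bicoherence at a bifrequency, can be sketched directly from a standard normalization; here the segments are stimulus-locked windows taken at the same offset from the reference, and the frequencies are FFT bin indices:

```python
import numpy as np

def bicoherence(segments, f1, f2):
    """Ensemble bicoherence estimate at bifrequency (f1, f2).
    segments: (n_trials x n_samples) array of stimulus-locked windows.
    Returns a value in [0, 1]; near 1 indicates quadratic phase coupling
    between bins f1, f2 and f1+f2 that is consistent across trials."""
    X = np.fft.fft(segments * np.hanning(segments.shape[1]), axis=1)
    triple = X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])
    num = np.abs(triple.mean())
    den = np.sqrt((np.abs(X[:, f1] * X[:, f2]) ** 2).mean()
                  * (np.abs(X[:, f1 + f2]) ** 2).mean())
    return num / den
```

Tracking this value for each electrode as a function of offset from the stimulus yields the per-channel bicoherence evolution that the segmentation step compares via mean absolute difference.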
Abstract:
This paper describes an innovative platform that facilitates the collection of objective safety data around occurrences at railway level crossings using data sources including forward-facing video, telemetry from trains and geo-referenced asset and survey data. This platform is being developed with support from the Australian rail industry and the Cooperative Research Centre for Rail Innovation. The paper provides a description of the underlying accident causation model, the development methodology and refinement process, as well as a description of the data collection platform. The paper concludes with a brief discussion of the benefits this project is expected to provide to the Australian rail industry.
Abstract:
There are different ways to authenticate humans, an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement for expensive devices and the risk of stolen bio-templates. Moreover, in existing approaches authentication is usually performed only once, at the start of a session. Non-intrusive and continuous monitoring of user activities emerges as a promising way to harden the authentication process, adding a fourth factor: how someone behaves. In recent years various keystroke-dynamics behavior-based approaches were published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only few solutions are deployable in application environments under realistic conditions; unsolved problems include scalability, high response times and high error rates. The aim of this work is the development of behavior-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments, in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.
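As a toy illustration of free-text continuous verification (an illustrative scoring rule, not a published algorithm): a window of observed digraph latencies is scored against a user's stored profile of per-digraph means and standard deviations, and the session is accepted while the anomaly score stays below a threshold.

```python
import numpy as np

def keystroke_anomaly(profile_means, profile_stds, observed):
    """Mean absolute z-score of observed digraph latencies against a stored
    typing profile. Low scores suggest the same typist; all thresholds and
    feature choices here are illustrative assumptions."""
    z = np.abs((np.asarray(observed, dtype=float) - profile_means) / profile_stds)
    return float(z.mean())

def verify(profile_means, profile_stds, observed, threshold=2.0):
    """Continuous-verification decision for one monitoring window."""
    return keystroke_anomaly(profile_means, profile_stds, observed) < threshold
```

Re-evaluating this decision on every new window of typing gives the transparent, continuous verification described above, rather than a single check at login.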
Abstract:
Research on Enterprise Resource Planning (ERP) Systems is becoming a well-established research theme in Information Systems (IS) research. ERP Systems, given their unique differentiation from other IS applications, have provided an interesting backdrop to test and re-test some of the key and fundamental concepts in IS. While some researchers have tested well-established concepts of technology acceptance, system usage and system success in the context of ERP Systems, others have researched how new paradigms like cloud computing and social media integrate with ERP Systems. Moreover, ERP Systems have provided the context for cross-disciplinary research such as knowledge management, project management and business process management research. Almost two decades after its inception in IS research, this paper provides a critique of 198 papers published on ERP Systems from 2006 to 2012. We observe patterns in ERP Systems research, provide comparisons to past studies and propose future research directions.