936 results for Process analysis
Abstract:
This paper investigates the relationship between traffic conditions and crash occurrence likelihood (COL) using I-880 data. To remedy the data limitations and methodological shortcomings of previous studies, a multiresolution data processing method is proposed and implemented, upon which binary logistic models were developed. The major findings of this paper are: 1) traffic conditions have significant impacts on COL at the study site; specifically, COL in a congested (transitioning) traffic flow is about 6 (1.6) times that in a free-flow condition; 2) speed variance alone is not sufficient to capture the impact of traffic dynamics on COL; a traffic chaos indicator that integrates speed, speed variance, and flow is proposed and shows promising performance; 3) models based on aggregated data should be interpreted with caution. Generally, conclusions obtained from such models should not be generalized to individual vehicles (drivers) without further evidence from high-resolution data, and it is dubious to either claim or disclaim that "speed kills" on the basis of aggregated data.
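The binary logistic modelling step described above can be illustrated with a minimal, hedged sketch: the data, variable names and effect sizes below are synthetic (seeded to roughly reproduce the 6x / 1.6x odds-ratio framing), not the paper's I-880 dataset or its actual specification.

```python
# Sketch only: logistic regression of crash occurrence on traffic state,
# with free flow as the reference category.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
state = rng.integers(0, 3, size=n)                 # 0 = free flow, 1 = transitioning, 2 = congested
effect = np.array([0.0, np.log(1.6), np.log(6.0)])[state]
p = 1.0 / (1.0 + np.exp(-(-4.0 + effect)))         # low baseline crash probability per interval
crash = rng.binomial(1, p)

dummies = pd.get_dummies(pd.Series(state).map({0: "free", 1: "transition", 2: "congested"}))
X = sm.add_constant(dummies[["transition", "congested"]].astype(float))   # free flow = reference

fit = sm.Logit(crash, X).fit(disp=False)
print(fit.summary())
print("Odds ratios vs. free flow:")
print(np.exp(fit.params[["transition", "congested"]]))
```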
Abstract:
Most crash severity studies ignore the severity correlations between driver-vehicle units involved in the same crash. Models that do not account for these within-crash correlations yield biased estimates of factor effects. This study developed a Bayesian hierarchical binomial logistic model to identify the significant factors affecting the severity of driver injury and vehicle damage in traffic crashes at signalized intersections. Crash data from Singapore were employed to calibrate the model. Model fit assessment and comparison using the Intra-class Correlation Coefficient (ICC) and the Deviance Information Criterion (DIC) confirmed the suitability of introducing crash-level random effects. Crashes occurring in peak time, under good street lighting, or involving pedestrian injuries are associated with lower severity, while those occurring at night, at T/Y type intersections, on the right-most lane, or at intersections equipped with red light cameras have larger odds of being severe. Moreover, heavy vehicles offer better resistance to severe crashes, while crashes involving two-wheel vehicles, young or aged drivers, or an offending party are more likely to result in severe injuries.
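As a rough illustration of the hierarchical structure described above, the sketch below fits a random-intercept logistic model in PyMC, with one shared random effect per crash to capture within-crash correlation. The data, priors and variable names are invented; the paper reports DIC, whereas the quantity printed here is the standard ICC approximation for a logistic random-intercept model.

```python
# Hypothetical sketch: crash-level random effect shared by all driver-vehicle
# units from the same crash.
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_crash, units_per_crash = 200, 2
crash_idx = np.repeat(np.arange(n_crash), units_per_crash)   # unit -> crash mapping
x = rng.normal(size=crash_idx.size)                          # e.g. a driver/vehicle covariate
true_u = rng.normal(0, 1.0, size=n_crash)                    # shared crash-level effect
eta = -1.0 + 0.8 * x + true_u[crash_idx]
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))                  # 1 = severe outcome

with pm.Model():
    beta0 = pm.Normal("beta0", 0, 5)
    beta1 = pm.Normal("beta1", 0, 5)
    sigma_u = pm.HalfNormal("sigma_u", 2)
    u = pm.Normal("u", 0, sigma_u, shape=n_crash)            # crash-level random effects
    logit_p = beta0 + beta1 * x + u[crash_idx]
    pm.Bernoulli("severe", logit_p=logit_p, observed=y)
    idata = pm.sample(1000, tune=1000, target_accept=0.9)

# ICC analogue for a logistic random-intercept model: sigma_u^2 / (sigma_u^2 + pi^2/3)
sigma_post = idata.posterior["sigma_u"].mean().item()
print("Approx. ICC:", sigma_post**2 / (sigma_post**2 + np.pi**2 / 3))
```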
Abstract:
Decellularized tissues can provide a unique biological environment for regenerative medicine applications only if minimal disruption of their microarchitecture is achieved during the decellularization process. The goal is to keep the structural integrity of such a construct as functional as the tissue from which it was derived. In this work, cartilage-on-bone laminates were decellularized through enzymatic, non-ionic and ionic protocols. The effects of the decellularization process on the microarchitecture of the cartilaginous extracellular matrix (ECM) were investigated to determine the extent to which each process deteriorated the structural organization of the network. High-resolution microscopy was used to capture cross-sectional images of samples before and after treatment. The variation in microarchitecture was then analysed using a well-defined fast Fourier transform image processing algorithm. Statistical analysis of the results revealed that the differences among the aforementioned protocols were significant (p < 0.05). Ranked by their effectiveness in disrupting ECM integrity, the treatments were ordered: Trypsin > SDS > Triton X-100.
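A minimal sketch of the kind of FFT-based orientation analysis mentioned above: the 2D power spectrum of a cross-sectional micrograph is binned by orientation, yielding a fibre-alignment profile that can be compared before and after treatment. The file names, bin count and comparison metric are assumptions, not the paper's algorithm.

```python
import numpy as np
from imageio.v3 import imread

def orientation_profile(path, n_bins=36):
    img = imread(path).astype(float)
    if img.ndim == 3:                                    # collapse RGB(A) to grayscale
        img = img[..., :3].mean(axis=2)
    img -= img.mean()                                    # remove the DC component
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = spectrum.shape
    y, x = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    theta = np.mod(np.arctan2(y, x), np.pi)              # orientation of each frequency component
    power, _ = np.histogram(theta, bins=np.linspace(0, np.pi, n_bins + 1), weights=spectrum)
    return power / power.sum()                           # normalized orientation distribution

before = orientation_profile("cartilage_before.png")     # hypothetical image files
after = orientation_profile("cartilage_after_sds.png")
print("L1 distance between orientation profiles:", np.abs(before - after).sum())
```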
Abstract:
Most approaches to business process compliance are restricted to the analysis of the structure of processes. It has been argued that full regulatory compliance requires information not only on the structure of processes but also on what the tasks in a process do. To this end, Governatori and Sadiq [2007] proposed to extend business processes with semantic annotations. We propose a methodology to automatically extract one kind of such annotations, in particular the annotations related to the data schema and templates linked to the various tasks in a business process.
Abstract:
In this paper, a class of fractional advection–dispersion models (FADMs) is considered. This class includes five models: the time FADM, the mobile/immobile time FADM with a Caputo time-fractional derivative of order 0 < γ < 1, the space FADM with two-sided Riemann–Liouville derivatives, the time–space FADM, and the time-fractional advection–diffusion-wave model with damping, with index 1 < γ < 2. These equations can be used to simulate regional-scale anomalous dispersion with heavy tails. We propose computationally effective implicit numerical methods for these FADMs. The stability and convergence of the implicit numerical methods are analysed and compared systematically. Finally, results are given to demonstrate the effectiveness of the theoretical analysis.
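For concreteness, the sketch below implements a standard implicit scheme (the L1 approximation of the Caputo derivative, with central differences in space) for the simplest case above, a time-fractional advection–dispersion equation with 0 < γ < 1. Grid sizes, parameters and boundary conditions are illustrative; this is not the paper's method verbatim.

```python
import numpy as np
from math import gamma

# Illustrative parameters (not taken from the paper)
gam, v, K = 0.7, 0.2, 1.0           # fractional order, advection velocity, dispersion coefficient
L, T, nx, nt = 1.0, 0.5, 101, 200
dx, dt = L / (nx - 1), T / nt
x = np.linspace(0, L, nx)

# Initial pulse; homogeneous Dirichlet boundaries
u0 = np.exp(-200 * (x - 0.3) ** 2)
u0[0] = u0[-1] = 0.0
history = [u0.copy()]

# Spatial operator A = -v d/dx + K d^2/dx^2 on interior nodes (central differences)
n_in = nx - 2
A = (np.diag(-2 * K / dx**2 * np.ones(n_in))
     + np.diag((K / dx**2 - v / (2 * dx)) * np.ones(n_in - 1), 1)
     + np.diag((K / dx**2 + v / (2 * dx)) * np.ones(n_in - 1), -1))

sigma = dt ** (-gam) / gamma(2 - gam)
b = np.arange(1, nt + 1) ** (1 - gam) - np.arange(0, nt) ** (1 - gam)   # L1 weights b_j
M = sigma * np.eye(n_in) - A                                            # implicit system matrix

for n in range(1, nt + 1):
    # History term: sum_{j=1}^{n-1} b_j (u^{n-j} - u^{n-j-1}) on interior nodes
    hist = np.zeros(n_in)
    for j in range(1, n):
        hist += b[j] * (history[n - j][1:-1] - history[n - j - 1][1:-1])
    rhs = sigma * (history[n - 1][1:-1] - hist)
    u_new = np.zeros(nx)
    u_new[1:-1] = np.linalg.solve(M, rhs)    # backward-Euler-like implicit solve
    history.append(u_new)

print("solution peak at t = T:", history[-1].max())
```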
Abstract:
Data mining techniques extract repeated and useful patterns from large data sets, which in turn are used to predict the outcome of future events. The main purpose of the research presented in this paper is to investigate data mining strategies and develop an efficient framework for multi-attribute project information analysis to predict the performance of construction projects. The research team first reviewed existing data mining algorithms, applied them to systematically analyze a large project data set collected by survey, and finally proposed a data-mining-based decision support framework for project performance prediction. To evaluate the potential of the framework, a case study was conducted using data collected from 139 capital projects to analyze the relationship between the use of information technology and project cost performance. The results showed that the proposed framework has the potential to promote fast, easy-to-use, interpretable, and accurate project data analysis.
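The sort of data-mining step described above might look like the following sketch: a shallow decision tree trained on multi-attribute project records to predict cost performance. All attribute names, data and the IT-use/cost relationship are fabricated for illustration only.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 139   # same order of magnitude as the case study's capital projects
projects = pd.DataFrame({
    "it_usage_score": rng.uniform(0, 10, n),        # hypothetical IT-use index
    "project_size_musd": rng.lognormal(3, 1, n),
    "design_complete_pct": rng.uniform(30, 100, n),
})
# Hypothetical label: projects with higher IT use tend to meet cost targets
cost_on_target = (projects["it_usage_score"] + rng.normal(0, 2, n) > 5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    projects, cost_on_target, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(export_text(tree, feature_names=list(projects.columns)))
print("hold-out accuracy:", tree.score(X_test, y_test))
```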
Abstract:
Purpose – In the context of the global knowledge economy, knowledge-based urban development (KBUD) is seen as an effective development strategy for city-regions to survive, flourish and become highly competitive urban agglomerations, i.e., knowledge city-regions. This paper aims to evaluate the KBUD dynamics, capacity and potential of a rapidly emerging knowledge city-region of Finland: the Tampere region.
Design/methodology/approach – The paper undertakes a review of the literature on regional development in the knowledge economy era. It adopts a qualitative analysis technique to scrutinize the dynamics, capacity and potential of the Tampere region. The semi-structured interview process starts with pre-determined key actors of the city-region with the aim of identifying the other key players. Next, with the participation of all key players in the interviews, the research reveals the principal issues, assets and mechanisms that relate to KBUD, and portrays the strengths, weaknesses, opportunities and threats of the city-region. A critical analysis of the findings, together with previous studies, provides a clear picture of the dynamics, capacity and potential of the emerging knowledge city-region.
Originality/value – This paper reports the findings of a pioneering study investigating the KBUD dynamics, capacity and potential of the Tampere region. The paper critically evaluates the city-region from the knowledge perspective through the lens of KBUD, and the lessons learned and the methodological approach shed light for other city-regions seeking such development.
Practical implications – The paper discusses the findings of a study of the Tampere region that critically scrutinizes the city-region's KBUD experience. The research provides an invaluable opportunity to inform regional decision-, policy- and plan-making mechanisms by determining key issues, actors, assets, processes and potential development directions for the KBUD of the Tampere region.
Abstract:
Business process management (BPM) is becoming the dominant management paradigm. Business process modelling is central to BPM, and the resultant business process model is the core artefact guiding subsequent process change. Model quality therefore takes centre stage, mediating between the modelling effort and the related, growing investment in ultimate process improvements. Nonetheless, though research interest in the properties that differentiate high-quality process models is longstanding, there have been no past reports of a valid, operationalised, holistic measure of business process model quality. In response to this gap, this paper reports the validation of a Business Process Model Quality measurement model, conceptualised as a single-order, formative index. Such a measurement model has value as the dependent variable in rigorously researching the drivers of model quality, as an antecedent of ultimate process improvements, and potentially as an economical comparator and diagnostic for practice.
Abstract:
Peeling is an essential phase of the post-harvest and processing industry; however, undesirable processing losses are unavoidable and have always been the main concern of the food processing sector. There are three methods of peeling fruits and vegetables: mechanical, chemical and thermal, chosen depending on the class and type of produce. By comparison, mechanical methods are the most preferred; they do not create any harmful effects on the tissue and they keep the edible portions of produce fresh. The main disadvantage of mechanical peeling is the rate of material loss and deformation. Reducing material losses and increasing the quality of the process have a direct effect on the overall efficiency of the food processing industry, which calls for further study of the technological aspects of these operations. To enhance the effectiveness of food industrial practices it is essential to have a clear understanding of material properties and the behaviour of tissues under industrial processes. This paper presents a scheme of research that seeks to examine tissue damage of tough-skinned vegetables under the mechanical peeling process by developing a novel FE model of the process using an explicit dynamic finite element analysis approach. A computer model of the mechanical peeling process will be developed in this study to simulate the energy consumption and the stress–strain interactions between cutter and tissue. Available finite element software and methods will be applied to establish the model. Improving knowledge of the interactions and variables involved in food operations, particularly in the peeling process, is the main objective of the proposed study. Understanding these interrelationships will help researchers and designers of food processing equipment to develop new and more efficient technologies. The presented work reviews the available literature and previous work in this area of research and identifies current gaps in the modelling and simulation of food processes.
Abstract:
Reducing complexity in Information Systems is a main concern in both research and industry. One strategy for reducing complexity is separation of concerns, which advocates separating various concerns, such as security and privacy, from the main concern. It results in less complex, more easily maintainable, and more reusable Information Systems. Separation of concerns is addressed through the Aspect Oriented paradigm. This paradigm has been well researched and implemented in programming, where languages such as AspectJ have been developed. However, research on aspect orientation for Business Process Management is still in its early stages. While some efforts have been made to propose Aspect Oriented Business Process Modelling, it has not yet been investigated how to enact such process models in a Workflow Management System. In this paper, we define a set of requirements that specifies the execution of aspect oriented business process models. We create a Coloured Petri Net specification for the semantics of a so-called Aspect Service that fulfils these requirements. Such a service extends the capability of a Workflow Management System with support for the execution of aspect oriented business process models. The design specification of the Aspect Service is also inspected through state space analysis.
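To make the execution idea concrete, here is a toy place/transition net interpreter in plain Python (not the paper's Coloured Petri Net specification): a "before" advice transition must consume the task-request token ahead of the advised core task, the kind of ordering an Aspect Service would enforce inside a workflow engine.

```python
from dataclasses import dataclass, field

@dataclass
class PetriNet:
    marking: dict[str, int]                                  # tokens per place
    transitions: dict[str, tuple[list[str], list[str]]] = field(default_factory=dict)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

net = PetriNet(marking={"task_requested": 1})
net.add_transition("before_advice", ["task_requested"], ["advice_done"])
net.add_transition("core_task", ["advice_done"], ["task_completed"])

for t in ["before_advice", "core_task"]:   # the only firing order the marking enables
    net.fire(t)
print(net.marking)   # {'task_requested': 0, 'advice_done': 0, 'task_completed': 1}
```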
Abstract:
Non-profit organisations by their very nature are staffed by a variety of people with a range of backgrounds, experiences and reasons for participation. These differences can lead to "distancing" of certain groups, and with little time or money for boundary spanning, the organisation can find itself in a fractured state that hampers not just its goal realisation but its goal determination. Strategic planning is often seen as an expensive, time-consuming process that many smaller non-profit organisations can ill afford to indulge in. In addition, the ruling elite, whether historical or professional, may view the process as unnecessary or threatening. However, strategic planning can offer processes and potential outcomes that non-profit organisations cannot afford to ignore. This paper provides an analysis through one case study of a non-profit, health-related organisation that moved through a process of strategic planning that ultimately encouraged development and group cohesion through goal identification and determination as well as strategy formulation. The results indicate the importance of valuing the strategic planning process itself rather than the form it takes. Challenging the dominance of the historical or professional elite can be difficult in a non-profit organisation, but diversity of involvement rather than uniformity proved to be a successful strategy. Organisational cohesion through consensus building was the ultimate outcome.
Abstract:
A breaker restrike is an abnormal arcing phenomenon leading to possible breaker failure. Eventually, this failure interrupts the transmission and distribution of the electricity supply system until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on measuring and interpreting restrikes produced during switching of capacitor banks and shunt reactor banks in power systems. In 2008 a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the radiometric measurement method has a band-limited frequency response as well as limitations in amplitude determination. Current restrike detection methods and algorithms require wide-bandwidth current transformers and high voltage dividers. A restrike switch model using the Alternative Transient Program (ATP) and Wavelet Transforms that supports diagnostics is proposed. Restrike phenomena thereby become the basis of a new diagnostic process, using measurements, ATP and Wavelet Transforms, for online interrupter monitoring. This research project investigates the restrike switch model parameter 'A' (dielectric voltage gradient) related to normal and slowed cases of the contact opening velocity and the escalation voltages, which can be used as a diagnostic tool for a vacuum circuit-breaker (CB) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load, at current quenching or chopping, a transient voltage is developed across the contact gap. The dielectric strength of the gap should rise to a point where it withstands this transient voltage; if it does not, the gap will flash over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap. This is the point at which the gap voltage has reached a value that exceeds the dielectric strength of the gap. This research shows that a change in the opening contact velocity of the vacuum CB produces a corresponding change in the slope of the gap escalation voltage envelope. To investigate the diagnostic process, the ATP restrike switch model was modified with contact opening velocity computation for restrike waveform signature analyses, along with experimental investigations. This also enhanced a mathematical CB model with the empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted if there are similar restrike waveform signatures in the measured and simulated waveforms. The restrike switch model is used for: computer simulations as virtual experiments, including predicting breaker restrikes; estimating the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and developing a restrike detection algorithm using Wavelet Transforms. A simulated high-frequency nozzle current magnitude was applied to an equation (derived from the literature) which can calculate the life extension of the interrupter of an SF6 high voltage CB. The restrike waveform signatures of a medium and a high voltage CB identify possible failure mechanisms such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using Matlab software for automatic detection.

Experimental investigation of a 12 kV vacuum CB diagnostic was carried out for parameter determination, and a passive antenna calibration was also successfully developed, with applications for field implementation. The degradation features were also evaluated with a predictive interpretation technique from the experiments, and the subsequent simulation indicates that the drop in voltage related to the slow opening velocity mechanism gives a measure of the degree of contact degradation. A predictive interpretation technique is a computer modelling approach for assessing switching device performance that allows one to vary a single parameter at a time; this is often difficult to do experimentally because of the variable contact opening velocity. The significance of this thesis outcome is a non-intrusive method, developed using measurements, ATP and Wavelet Transforms, to predict and interpret breaker restrike risk. Measurements on high voltage circuit-breakers can identify degradation that can interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for monitoring restrike phenomena developed in this research will form part of a diagnostic process that will be valuable for detecting breaker stresses related to interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
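A hedged sketch of the wavelet-based detection idea described in this abstract: a synthetic 50 Hz voltage with an injected high-frequency burst is decomposed with PyWavelets, and the finest-scale detail coefficients are thresholded to flag the transient. The sampling rate, wavelet, decomposition depth and threshold factor are arbitrary assumptions, not the thesis's algorithm or parameters.

```python
import numpy as np
import pywt

fs = 100_000                                     # hypothetical 100 kS/s sampling rate
t = np.arange(4000) / fs                         # two full cycles of 50 Hz
voltage = np.sin(2 * np.pi * 50 * t)

# Inject a short damped high-frequency burst as a stand-in for a restrike
burst = (t > 0.021) & (t < 0.0215)
voltage[burst] += 0.3 * np.exp(-(t[burst] - 0.021) / 2e-4) * np.sin(2 * np.pi * 20_000 * t[burst])

coeffs = pywt.wavedec(voltage, "db4", level=4, mode="periodization")
d1 = coeffs[-1]                                  # finest-scale detail coefficients
threshold = 5 * np.median(np.abs(d1)) / 0.6745   # robust, noise-scaled threshold (arbitrary factor)
hits = np.nonzero(np.abs(d1) > threshold)[0]
if hits.size:
    print(f"possible restrike near t = {hits[0] * 2 / fs:.5f} s")   # each d1 coefficient spans ~2 samples
else:
    print("no transient detected")
```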
Abstract:
The concept of Six Sigma was initiated in the 1980s by Motorola. Since then it has been implemented in several manufacturing and service organizations. To date, Six Sigma implementation in the service sector has mostly been limited to healthcare and financial services in the private sector. Its implementation is now gradually picking up in services such as call centers, education, and construction and related engineering, in the private as well as the public sector. Through a literature review, a questionnaire survey, and a multiple case study approach, the paper develops a conceptual framework to facilitate widening the scope of Six Sigma implementation in service organizations. Using grounded theory methodology, this study develops theory for Six Sigma implementation in service organizations. The study involves a questionnaire survey and case studies to understand the issues and build a conceptual framework. The survey was conducted in service organizations in Singapore and was exploratory in nature. The case studies involved three service organizations that implemented Six Sigma; the objective was to explore and understand the issues highlighted by the survey and the literature. The findings confirm the inclusion of critical success factors, critical-to-quality characteristics, and a set of tools and techniques, as observed in the literature. In the case of key performance indicators, there are differing interpretations in the literature and among industry practitioners: some literature describes key performance indicators as performance metrics, whereas others treat them as key process input or output variables, which is similar to the interpretations of Six Sigma practitioners. The responses 'not relevant' and 'unknown to us' as reasons for not implementing Six Sigma show the need to understand the specific requirements of service organizations. Though much theoretical description of Six Sigma is available, there has been limited rigorous academic research on it. This gap is far more pronounced for Six Sigma implementation in service organizations, where the theory is not yet mature. Identifying this need, the study contributes by undertaking a theory-building exercise and developing a conceptual framework to understand the issues involved in Six Sigma implementation in service organizations.